978 results for Eddy Current Testing


Relevance:

30.00%

Publisher:

Abstract:

Temporally growing frontal meandering and occasional eddy shedding are observed in the Brazil Current (BC) as it flows adjacent to the Brazilian coast. No study of the dynamics of this phenomenon has been conducted to date in the region between 22 degrees S and 25 degrees S. Within this latitude range, the flow over the intermediate continental slope is marked by a current inversion at depth that is associated with the Intermediate Western Boundary Current (IWBC). A time series analysis of data from a 10-current-meter mooring was used to describe a mean vertical profile for the BC-IWBC jet and a typical meander vertical structure. The latter was obtained by an empirical orthogonal function (EOF) analysis that showed a single mode explaining 82% of the total variance. This mode structure decayed sharply with depth, revealing that the meandering is much more vigorous within the BC domain than in the IWBC region. As the spectral analysis of the mode amplitude time series revealed no significant periods, we searched for dominant wavelengths. This search was done via a spatial EOF analysis of 51 thermal front patterns derived from digitized AVHRR images. Four modes were statistically significant at the 95% confidence level. Modes 3 and 4, which together explained 18% of the total variance, are associated with 266-km and 338-km vorticity waves, respectively. With this new information derived from the data, the [Johns, W.E., 1988. One-dimensional baroclinically unstable waves on the Gulf Stream potential vorticity gradient near Cape Hatteras. Dyn. Atmos. Oceans 11, 323-350] one-dimensional quasi-geostrophic model was applied to the interpolated mean BC-IWBC jet. The results indicated that the BC system is indeed baroclinically unstable and that the wavelengths depicted in the thermal front analysis are associated with the most unstable waves produced by the model. Growth rates were about 0.06 (0.05) days(-1) for the 266-km (338-km) wave. Moreover, phase speeds for these waves were low compared to the surface BC velocity and may account for remarks in the literature about growing standing or stationary meanders off southeast Brazil. The theoretical vertical structure modes associated with these waves resembled very closely the one obtained from the current-meter mooring EOF analysis. We interpret this agreement as a confirmation that baroclinic instability is an important mechanism in meander growth in the BC system. (C) 2008 Elsevier B.V. All rights reserved.
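To illustrate the kind of mode decomposition described above, the following is a minimal sketch of an EOF analysis of a (time x depth) velocity array via singular value decomposition. The array `velocity`, its shape and its values are purely illustrative assumptions, not the study's mooring data.

```python
# Minimal EOF sketch, assuming a hypothetical (time x depth) array of along-stream
# current-meter velocities; shapes and values are illustrative, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
velocity = rng.standard_normal((500, 10))      # 500 time steps, 10 instrument depths

anomaly = velocity - velocity.mean(axis=0)     # remove the time mean at each depth
# SVD of the anomaly matrix yields the EOFs (vertical modes) and their amplitudes
u, s, vt = np.linalg.svd(anomaly, full_matrices=False)

variance_fraction = s**2 / np.sum(s**2)        # fraction of total variance per mode
mode1_profile = vt[0]                          # vertical structure of mode 1
mode1_amplitude = u[:, 0] * s[0]               # amplitude time series of mode 1

print(f"Mode 1 explains {variance_fraction[0]:.1%} of the total variance")
```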

Relevance:

30.00%

Publisher:

Abstract:

This paper aims to discuss and test the hypothesis raised by Fusar-Poli [Fusar-Poli P. Can neuroimaging prove that schizophrenia is a brain disease? A radical hypothesis. Medical Hypotheses, in press, corrected proof] that "on the basis of the available imaging literature there is no consistent evidence to reject the radical and provocative hypothesis that schizophrenia is not a brain disease". To achieve this goal, all meta-analyses on `fMRI and schizophrenia` published during the current decade and indexed in PubMed were summarized, together with some other useful information, e.g., meta-analyses on genetic risk factors. Our main conclusion is that the literature fully supports the hypothesis that schizophrenia is a syndrome (not a disease) associated with brain abnormalities, despite the fact that there is no singular and reductionist pathway from the nosographic entity (schizophrenia) to its causes. This irreducibility is due to the fact that the syndrome has more than one dimension (e.g., cognitive, psychotic and negative) and each of them is related to abnormalities in specific neuronal networks. A psychiatric diagnosis is a statistical procedure; these dimensions are not identically represented in each diagnosed case, and this explains the existence of more than one pattern of brain abnormalities related to schizophrenia. For example, chronification is associated with negativism while the first psychotic episode is not; in that sense, the same person living with schizophrenia may reveal different symptoms and fMRI patterns over the course of his or her life, and this is precisely what has defined schizophrenia since the time when it was called Dementia Praecox (first by Pick, then by Kraepelin). It is notable that 100% of the collected meta-analyses on `fMRI and schizophrenia` reveal positive findings. Moreover, all meta-analyses that found positive associations between schizophrenia and genetic risk factors have to do with genes (SNPs) especially activated in neuronal tissue of the central nervous system (CNS), suggesting that, to the extent these polymorphisms are related to schizophrenia's etiology, they are also related to abnormal brain activity. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Drug testing is used by employers to detect drug use by employees or job candidates. It can identify recent use of alcohol, prescription drugs, and illicit drugs, and serves as a screening tool for potential health, safety, and performance issues. Urine is the most commonly used sample for illicit drugs. It detects the use of a drug within the last few days and as such is evidence of recent use, but a positive test does not necessarily mean that the individual was impaired at the time of the test. Abstention from use for three days will often produce a negative test result. Analysis of hair provides a much longer window of detection, typically 1 to 3 months. Hence the likelihood of a false-negative test using hair is very much less than with a urine test. Conversely, a negative hair test is a substantially stronger indicator of a non-drug user than a negative urine test. Oral fluid (saliva) is also easy to collect. Drugs remain in oral fluid for a similar time as in blood. The method is a good way of detecting current use and is more likely to reflect current impairment. It offers promise as a test in post-accident, for-cause, and on-duty situations. Studies have shown that, within the same industrial settings, hair testing can detect twice as many drug users as urine testing. Copyright (C) 2012 John Wiley & Sons, Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Oceanic eddy generation by tall deep-water islands is a common phenomenon. It is recognized that these eddies may have a significant impact on the marine system and related biogeochemical fluxes. Hence, it is important to establish the favourable conditions for their generation. With this objective, we present an observational study on eddy generation mechanisms by tall deep-water islands, using the island of Gran Canaria as a case study. Observations show that the main generation mechanism is topographic forcing, which leads to eddy generation when the incident oceanic flow is sufficiently intense. Wind shear at the island wake may act only as an additional trigger mechanism for eddy generation when the impinging oceanic flow is not sufficiently intense. For the case of the island of Gran Canaria we have observed a mean of ten cyclonic eddies generated per year. Eddies are generated more frequently in summer, coinciding with intense Trade winds and Canary Current flow.

Relevance:

30.00%

Publisher:

Abstract:

The Székesfehérvár Ruin Garden is a unique assemblage of monuments within the cultural heritage of Hungary owing to its important role in the Middle Ages as the coronation and burial church of the kings of the Hungarian Christian Kingdom. It has been nominated as a “National Monument” and, as a consequence, its protection in the present and future is required. Moreover, it was reconstructed and expanded several times throughout Hungarian history. A quick overview of the current state of the monument reveals several lithotypes among the remaining building and decorative stones. Therefore, research related to the materials is crucial not only for the conservation of this specific monument but also for other historic structures in Central Europe. The current research is divided into three main parts: i) description of the lithologies and their provenance, ii) physical properties testing of the historic material and iii) durability tests of analogous stones obtained from active quarries. The survey of the National Monument of Székesfehérvár focuses on the historical importance and architecture of the monument, the different construction periods, and the identification of the different building stones and their distribution in the remaining parts of the monument; it also includes provenance analyses. The second part comprises in situ and laboratory testing of the physical properties of the historic material. In the final phase, samples were taken from local quarries with physical and mineralogical characteristics similar to those of the stones used in the monument. The three studied lithologies are a fine oolitic limestone, a coarse oolitic limestone and a red compact limestone. These stones were used for rock mechanical and durability tests under laboratory conditions. The following techniques were used: a) in situ: Schmidt hammer values, moisture content measurements, DRMS, and mapping (construction ages, lithotypes, weathering forms); b) laboratory: petrographic analysis, XRD, determination of real density by means of a helium pycnometer and bulk density by means of a mercury pycnometer, pore size distribution by mercury intrusion porosimetry and by nitrogen adsorption, water absorption, determination of open porosity, DRMS, frost resistance, ultrasonic pulse velocity testing, uniaxial compressive strength testing and dynamic modulus of elasticity. The results show that the initial uniaxial compressive strength is not necessarily a clear indicator of stone durability. Bedding and other lithological heterogeneities can influence the strength and durability of individual specimens. In addition, long-term behaviour is influenced by exposure conditions, fabric and, especially, the pore size distribution of each sample. Therefore, a statistical evaluation of the results is highly recommended, and they should be evaluated in combination with other investigations of the internal structure and micro-scale heterogeneities of the material, such as petrographic observation, ultrasonic pulse velocity and porosimetry. Laboratory tests used to estimate the durability of natural stone may give good guidance on its short-term performance, but they should not be taken as an ultimate indication of the long-term behaviour of the stone. The interdisciplinary study of the results confirms that stones in the monument show deterioration in terms of mineralogy, fabric and physical properties in comparison with quarried stones. Moreover, stone testing proves compatibility between quarried and historical stones.
Good correlation is observed between the non-destructive techniques and the laboratory test results, which allows us to minimize sampling when assessing the condition of the materials. In conclusion, this research contributes diagnostic knowledge for further studies needed to evaluate the effect of recent and future protective measures.
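As a small illustration of the laboratory programme described above, the sketch below computes open porosity and bulk density from dry, water-saturated and submerged weights (hydrostatic weighing). The function names and specimen masses are illustrative assumptions, not values measured on the Székesfehérvár stones.

```python
# Hedged sketch of two standard reductions used in such laboratory programmes:
# open porosity and bulk density from hydrostatic weighing. All masses are invented.
def open_porosity_percent(m_dry, m_sat, m_sub):
    """Open (water-accessible) porosity in % of bulk volume."""
    return 100.0 * (m_sat - m_dry) / (m_sat - m_sub)

def bulk_density(m_dry, m_sat, m_sub, rho_water=1000.0):
    """Bulk density in kg/m^3; (m_sat - m_sub)/rho_water is the bulk volume."""
    return m_dry / ((m_sat - m_sub) / rho_water)

# Hypothetical oolitic limestone specimen (masses in kg)
m_dry, m_sat, m_sub = 0.250, 0.265, 0.150
print(f"open porosity = {open_porosity_percent(m_dry, m_sat, m_sub):.1f} %")
print(f"bulk density  = {bulk_density(m_dry, m_sat, m_sub):.0f} kg/m^3")
```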

Relevance:

30.00%

Publisher:

Abstract:

For the improvement of current neutron capture therapy, several liposomal formulations of the neutron capture agent gadolinium were developed and tested in a glioma cell model. The formulations were analyzed regarding physicochemical and biological parameters, such as size, zeta potential, uptake into cancer cells and performance under neutron irradiation. The neutron and photon dose derived from intracellular as well as extracellular Gd was calculated via Monte Carlo simulations and correlated with the reduction of cell survival after irradiation. To investigate the suitability of Gd as a radiosensitizer for photon radiation, cells were also irradiated with synchrotron radiation in addition to clinically used photons generated by a linear accelerator. Irradiation with neutrons led to significantly lower survival for Gd-liposome-treated F98 and LN229 cells, compared to irradiated control cells and cells treated with non-liposomal Gd-DTPA. The correlation between Gd content and dose and the respective cell survival displayed a proportional relationship for most of the applied formulations. Photon irradiation experiments provided a proof of principle for the radiosensitizer approach, although the photon spectra currently used have to be optimized for higher efficiency of the radiosensitizer. In conclusion, the newly developed Gd-liposomes show great potential for the improvement of radiation treatment options for highly malignant glioblastoma.
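The correlation between calculated dose and measured survival mentioned above is commonly summarized with a linear-quadratic (LQ) fit. The sketch below shows such a fit; the dose points, survival fractions and starting parameters are invented for illustration and are not the study's data.

```python
# Generic linear-quadratic (LQ) fit of surviving fraction versus dose; all numbers
# below are invented for illustration and are not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

def lq_survival(dose, alpha, beta):
    """Linear-quadratic model: surviving fraction = exp(-alpha*D - beta*D^2)."""
    return np.exp(-alpha * dose - beta * dose**2)

dose_gy = np.array([0.0, 1.0, 2.0, 4.0, 6.0])
surviving_fraction = np.array([1.0, 0.80, 0.55, 0.25, 0.08])

(alpha, beta), _ = curve_fit(lq_survival, dose_gy, surviving_fraction, p0=(0.2, 0.02))
print(f"alpha = {alpha:.3f} 1/Gy, beta = {beta:.3f} 1/Gy^2")
```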

Relevance:

30.00%

Publisher:

Abstract:

Dual-systems theorists posit distinct modes of reasoning. The intuition system reasons automatically, and its processes are unavailable to conscious introspection. The deliberation system reasons effortfully, while its processes recruit working memory. The current paper extends the application of such theories to the study of Obsessive-Compulsive Disorder (OCD). Patients with OCD often retain insight into their irrationality, implying dissociable systems of thought: intuition produces obsessions and fears that deliberation observes and attempts (vainly) to inhibit. To test the notion that dual-systems theory can adequately describe OCD, we obtained speeded and unspeeded risk judgments from OCD patients and non-anxious controls in order to quantify the differential effects of intuitive and deliberative reasoning. As predicted, patients deemed negative events to be more likely than controls did. Patients also took more time in producing judgments than controls. Furthermore, when forced to respond quickly, patients' judgments were more affected than controls'. Although patients did attenuate their judgments when given additional time, their estimates never reached the levels of controls'. We infer from these data that patients have genuine difficulty inhibiting their intuitive cognitive system. Our dual-systems perspective is compatible with current theories of the disorder. Similar behavioral tests may prove helpful in better understanding related anxiety disorders. (C) 2013 Elsevier Ltd. All rights reserved.
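One simple way to quantify the group-by-time-pressure effect described above is to compare (speeded minus unspeeded) difference scores between groups. The sketch below does this with an independent-samples t-test; the sample sizes, rating scale and all values are invented for illustration and do not come from the study.

```python
# Sketch: compare (speeded - unspeeded) difference scores between groups with an
# independent-samples t-test. Group sizes, the 0-100 rating scale and all values
# are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
patients_speeded   = rng.normal(65, 10, 30)    # hypothetical mean risk judgments
patients_unspeeded = rng.normal(55, 10, 30)
controls_speeded   = rng.normal(42, 10, 30)
controls_unspeeded = rng.normal(40, 10, 30)

patient_shift = patients_speeded - patients_unspeeded   # effect of time pressure
control_shift = controls_speeded - controls_unspeeded

t, p = stats.ttest_ind(patient_shift, control_shift)
print(f"group x time-pressure interaction: t = {t:.2f}, p = {p:.4f}")
```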

Relevance:

30.00%

Publisher:

Abstract:

In cardiac muscle, the amplitude of Ca(2+) transients can be increased by enhancing Ca(2+) influx. Among the processes leading to increased Ca(2+) influx, agonists of the L-type Ca(2+) channel can play an important role. Known pharmacological Ca(2+)-channel agonists act on different binding sites on the channel protein, which may lead not only to enhanced peak currents, but also to distinct changes in other biophysical characteristics of the current. In this study, membrane currents were recorded with the patch-clamp technique in the whole-cell configuration in isolated guinea pig ventricular myocytes, in combination with confocal fluorescence Ca(2+) imaging techniques and a variety of pharmacological tools. Testing a new positive inotropic steroid-like compound, we found that it increased the L-type Ca(2+) current 2.5-fold by shifting the voltage dependence of activation by 20.2 mV towards negative potentials. The dose-response relationship revealed two vastly different affinities (EC(50(high-affinity))=4.5+/-1.7 nM, EC(50(low-affinity))=8.0+/-1.1 microM) exhibiting differential pharmacological interactions with three classes of Ca(2+)-current antagonists, suggesting more than one binding site on the channel protein. We therefore identified and characterized a novel positive inotropic compound (F90927) as a member of a new class of Ca(2+)-channel agonists exhibiting unique features that set it apart from other presently known L-type Ca(2+)-channel agonists.
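A dose-response relationship with two widely separated EC50 values, as quoted above, is often described with a two-site model in which the total response is the weighted sum of two saturable components. The sketch below fits such a model to synthetic data; the component fraction, concentrations and starting values are assumptions for illustration only.

```python
# Sketch of a two-site dose-response model: total response is the weighted sum of a
# high-affinity and a low-affinity saturable component. Concentrations are in nM;
# the component fraction, data points and starting values are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def two_site(conc_nm, frac_high, ec50_high_nm, ec50_low_nm):
    """Two-component binding curve, normalised to a maximal response of 1."""
    high = frac_high * conc_nm / (conc_nm + ec50_high_nm)
    low = (1.0 - frac_high) * conc_nm / (conc_nm + ec50_low_nm)
    return high + low

conc_nm = np.logspace(-1, 5, 13)                    # 0.1 nM ... 100 uM
response = two_site(conc_nm, 0.4, 4.5, 8.0e3)       # synthetic "data"

popt, _ = curve_fit(two_site, conc_nm, response, p0=(0.5, 1.0, 1.0e3))
print(f"EC50(high) = {popt[1]:.1f} nM, EC50(low) = {popt[2] / 1e3:.1f} uM")
```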

Relevance:

30.00%

Publisher:

Abstract:

Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse-effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating an adverse-effect NOEC (no observed effect concentration) or adverse-effect EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarker NOEC or biomarker EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to increasingly use biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides an insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines). While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action, such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable and biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision-making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.
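The interlaboratory coefficient of variation quoted above is simply the standard deviation of the laboratory results divided by their mean. A minimal illustration follows; the VTG values are hypothetical laboratory means, not data from the cited comparisons.

```python
# Coefficient of variation across laboratories: CV = sd / mean, in percent.
# The VTG values are hypothetical laboratory means, not data from the cited comparisons.
import numpy as np

vtg_ng_per_ml = np.array([820.0, 1150.0, 990.0, 1430.0, 760.0])   # one mean per lab
cv_percent = 100.0 * vtg_ng_per_ml.std(ddof=1) / vtg_ng_per_ml.mean()
print(f"interlaboratory CV = {cv_percent:.0f} %")
```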

Relevance:

30.00%

Publisher:

Abstract:

PRINCIPLES: Coeliac disease (gluten-sensitive enteropathy) is a genetically determined disorder with an incidence in the general population comparable to that of type 2 diabetes mellitus. Awareness of this fact, and of the often atypical and oligosymptomatic manifestations, is only now gaining ground in the medical profession. A high index of suspicion is important in order to minimise diagnostic and therapeutic delay. METHODS: Testing patterns and follow-up for coeliac disease in our institution were analysed retrospectively for the past five years. The current literature was reviewed with respect to recommendations for clinical practice. RESULTS: A total of 271 patients were tested for coeliac disease over a period of five years. Positive results were found in only 24 patients; after further work-up, the final number of cases with certain or presumed coeliac disease was four. Follow-up was often difficult, with many patients lost after a single visit. CONCLUSIONS: This study showed that the number of tests ordered in our institution, more often for abdominal than atypical symptoms, has started to increase in the past two years. It also showed that screening tests have found their place in general clinical practice, while the final choice of tests needs to be determined in accordance with available guidelines and local resources. Upper endoscopy with small bowel biopsy remains the gold standard for diagnosis, but its place in follow-up is less certain. Coeliac disease is a disorder for which there is a definite treatment (a gluten-free diet); if it is left untreated, diminished quality of life and potentially serious complications may ensue. Further education of the medical profession regarding coeliac disease, its incidence, presentation and treatment, is clearly indicated.

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a novel approach to making inference about the regression parameters in the accelerated failure time (AFT) model for current status and interval-censored data. The estimator is constructed by inverting a Wald-type test for testing a null proportional hazards model. A numerically efficient Markov chain Monte Carlo (MCMC) based resampling method is proposed to simultaneously obtain the point estimator and a consistent estimator of its variance-covariance matrix. We illustrate our approach with interval-censored data sets from two clinical studies. Extensive numerical studies are conducted to evaluate the finite sample performance of the new estimators.

Relevance:

30.00%

Publisher:

Abstract:

Determination of the chloride concentration in sweat is the current diagnostic gold standard for Cystic Fibrosis (CF). Nanoduct(R) is a new analyzing system that measures conductivity, requires only 3 microliters of sweat and gives results within 30 minutes. The aim of the study was to evaluate the applicability of this system in the clinical setting of three children's hospitals; borderline results were compared with the sweat chloride concentration. Over 3 years, 1,041 subjects were tested and diagnostic results were obtained in 946. In 95 children, Nanoduct(R) failed (9.1% failure rate), mainly in preterm babies and newborns. Assuming 59 mmol/L as the upper limit of normal conductivity, all of our 46 CF patients were correctly diagnosed (sensitivity 100%, 95% CI: 93.1-100; negative predictive value 100%, 95% CI: 99.6-100) and only 39 non-CF children were false positives (39/900, 4.3%; specificity 95.7%, 95% CI: 94.2-96.9; positive predictive value 54.1%, 95% CI: 43.4-65.0). Increasing the diagnostic limit to 80 mmol/L, the false-positive rate fell to 0.3% (3/900). CF patients had a median conductivity of 115 mmol/L; the non-CF children a median of 37 mmol/L. In conclusion, the Nanoduct(R) test is a reliable diagnostic tool for CF: it has a failure rate comparable to other sweat tests and can be used as a simple bedside test for fast and reliable exclusion, diagnosis or suspicion of CF. In cases with borderline conductivity (60-80 mmol/L), additional methods (determination of chloride and genotyping) are indicated.
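The headline statistics above follow directly from the reported counts (46 CF patients, all detected; 39 of 900 non-CF children above the cut-off). The short sketch below recomputes them; the confidence intervals quoted in the abstract are not reproduced.

```python
# Screening-test statistics recomputed from the counts reported above
# (46 CF patients all detected; 39 of 900 non-CF children above the cut-off).
def screening_stats(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screening_stats(tp=46, fn=0, fp=39, tn=861)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
```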

Relevance:

30.00%

Publisher:

Abstract:

AIMS: To determine whether the current practice of sweat testing in Swiss hospitals is consistent with the current international guidelines. METHODS: A questionnaire was mailed to all children's hospitals (n = 8), regional paediatric sections of general hospitals (n = 28), and all adult pulmonology centres (n = 8) in Switzerland which care for patients with cystic fibrosis (CF). The results were compared with the published "guidelines 2000" of the American National Committee for Clinical Laboratory Standards (NCCLS) and the UK guidelines of 2003. RESULTS: The response rate was 89%. All 8 children's hospitals and 18 of the 23 answering paediatric sections performed sweat tests, but none of the adult pulmonology centres did. In total, 1560 sweat tests were done per year (range: 5-200 tests/centre/year, median 40). 88% (23/26) used Wescor systems, 73% (19/26) the Macroduct system for collecting sweat and 31% (8/26) the Nanoduct system. Sweat chloride was determined by only 62% (16/26) of the centres; of these, only 63% (10/16) indicated that they used the recommended diagnostic chloride reference value for CF of >60 mmol/l. Osmolality was measured in 35%, sodium in 42% and conductivity in 62% of the hospitals. Sweat was collected for 30-120 minutes (median 55); only three centres used the maximum 30-minute collection time recommended by the international guidelines. CONCLUSIONS: Sweat testing practice in Swiss hospitals was inconsistent and seldom followed the current international guidelines for sweat collection, analysis methods and reference values. Only 62% used the chloride concentration as a diagnostic reference, the only diagnostic measurement accepted by the NCCLS or UK guidelines.

Relevance:

30.00%

Publisher:

Abstract:

Transformers are very important elements of any power system. Unfortunately, they are subjected to through-faults and abnormal operating conditions which can affect not only the transformer itself but also other equipment connected to the transformer. Thus, it is essential to provide sufficient protection for transformers as well as the best possible selectivity and sensitivity of the protection. Nowadays, microprocessor-based relays are widely used to protect power equipment. Current differential and voltage protection strategies are used in transformer protection applications and provide fast and sensitive multi-level protection and monitoring. The elements responsible for detecting turn-to-turn and turn-to-ground faults are the negative-sequence percentage differential element and the restricted earth-fault (REF) element, respectively. During severe internal faults, current transformers can saturate and slow down relay operation, which affects the degree of equipment damage. The scope of this work is to develop a modeling methodology to perform simulations and laboratory tests for internal faults such as turn-to-turn and turn-to-ground faults for two step-down power transformers with capacity ratings of 11.2 MVA and 290 MVA. The simulated current waveforms are injected into a microprocessor relay to check its sensitivity to these internal faults. Saturation of current transformers is also studied in this work. All simulations are performed with the Alternative Transients Program (ATP) utilizing the internal fault model for three-phase two-winding transformers. The tested microprocessor relay is the SEL-487E current differential and voltage protection relay. The results showed that the ATP internal fault model can be used for testing microprocessor relays for any percentage of turns involved in an internal fault. An interesting observation from the experiments was that the SEL-487E relay is more sensitive to turn-to-turn faults than advertised for the transformers studied. The sensitivity of the restricted earth-fault element was confirmed. The CT saturation cases showed that low-accuracy CTs can saturate during turn-to-turn faults involving a high percentage of turns, with the CT burden affecting the extent of saturation. Recommendations for future work include more accurate simulation of internal faults, transformer energization inrush, and other scenarios involving core saturation, using the newest version of the internal fault model. The SEL-487E relay or other microprocessor relays should again be tested for performance. Also, application of a grounding bank to the delta-connected side of a transformer will increase the zone of protection, and relay performance can then be tested for internal ground faults on both sides of the transformer.
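The negative-sequence quantity used by the differential element mentioned above comes from the standard symmetrical-component transformation, I2 = (Ia + a^2*Ib + a*Ic)/3 with a the 120-degree rotation operator. The sketch below applies it to an illustrative, slightly unbalanced set of phase-current phasors; the values are not relay test data.

```python
# Symmetrical-component sketch: negative-sequence current I2 = (Ia + a^2*Ib + a*Ic)/3,
# with a the 120-degree rotation operator. Phasor values are illustrative only.
import cmath
import math

a = cmath.exp(1j * math.radians(120))          # a = 1 at 120 degrees

def negative_sequence(ia, ib, ic):
    """Negative-sequence component of three phase-current phasors."""
    return (ia + a**2 * ib + a * ic) / 3.0

# Nearly balanced currents with a small unbalance on phase A (per unit)
ia = cmath.rect(1.05, math.radians(0))
ib = cmath.rect(1.00, math.radians(-120))
ic = cmath.rect(1.00, math.radians(120))

i2 = negative_sequence(ia, ib, ic)
print(f"|I2| = {abs(i2):.4f} pu")
```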

Relevance:

30.00%

Publisher:

Abstract:

There has been a continuous evolutionary process in asphalt pavement design. In the beginning it was crude and based on past experience. Through research, empirical methods were developed based on material response to specific loading at the AASHO Road Test. Today, pavement design has progressed to a mechanistic-empirical method. This methodology takes into account the mechanical properties of the individual layers and uses empirical relationships to relate them to performance. The mechanical tests that are used as part of this methodology include dynamic modulus and flow number, which have been shown to correlate with field pavement performance. This thesis was based on a portion of a research project being conducted at Michigan Technological University (MTU) for the Wisconsin Department of Transportation (WisDOT). The global scope of this project dealt with the development of a library of values for the mechanical properties of the asphalt pavement mixtures paved in Wisconsin. Additionally, a comparison of the current associated pavement design with that of the new AASHTO Design Guide was conducted. This thesis describes the development of the current pavement design methodology as well as the associated tests as part of a literature review. This report also details the materials that were sampled from field operations around the state of Wisconsin and their testing preparation and procedures. Testing was conducted on available round-robin mixtures and three Wisconsin mixtures, and the main results of the research were:
- The test history of the Superpave SPT (fatigue and permanent deformation dynamic modulus) does not affect the mean response for either dynamic modulus or flow number, but does increase the variability in the flow number test results.
- The method of specimen preparation, compacting to test geometry versus sawing/coring to test geometry, does not appear to have a statistically significant effect on the intermediate- and high-temperature dynamic modulus and flow number test results.
- The 2002 AASHTO Design Guide simulations support the findings of the statistical analyses that the method of specimen preparation did not impact the performance of the HMA as a structural layer as predicted by the Design Guide software.
- The methodologies for determining the temperature-viscosity relationship as stipulated by Witczak are sensitive to the viscosity test temperatures employed.
- An increase in asphalt binder content of 0.3% was found to actually increase the dynamic modulus at the intermediate and high test temperatures, as well as the flow number. This result was based on the testing that was conducted and contradicts previous research and the hypothesis put forth for this thesis; it should be used with caution and requires further review.
- Based on the limited results presented herein, the asphalt binder grade appears to have a greater impact on performance in the Superpave SPT than aggregate angularity.
- Dynamic modulus and flow number were shown to increase with traffic level (which requires an increase in aggregate angularity) and with a decrease in air voids, confirming the hypotheses regarding these two factors.
- Accumulated micro-strain at the flow number, as opposed to the flow number itself, appeared to be a promising measure for comparing the quality of specimens within a specific mixture.
At the current time, the Design Guide and its associated software need to be further improved prior to implementation by owners/agencies.
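As a brief illustration of how the dynamic modulus test mentioned above is reduced to a result, the sketch below computes |E*| as the ratio of stress amplitude to strain amplitude and the phase angle from the stress-strain time lag. The numbers are illustrative, not values from the WisDOT mixtures.

```python
# Reduction of a dynamic modulus test result: |E*| is the ratio of stress amplitude
# to strain amplitude, and the phase angle follows from the stress-strain time lag.
# All numbers are illustrative, not values measured on the WisDOT mixtures.
def dynamic_modulus_mpa(stress_amplitude_kpa, strain_amplitude_microstrain):
    """|E*| in MPa from stress amplitude (kPa) and recoverable strain (microstrain)."""
    strain = strain_amplitude_microstrain * 1e-6        # microstrain -> strain
    return (stress_amplitude_kpa / strain) / 1000.0     # kPa -> MPa

def phase_angle_deg(frequency_hz, time_lag_s):
    """Phase angle in degrees from loading frequency and stress-strain time lag."""
    return 360.0 * frequency_hz * time_lag_s

print(f"|E*| = {dynamic_modulus_mpa(600.0, 100.0):.0f} MPa")      # ~6000 MPa
print(f"phase angle = {phase_angle_deg(10.0, 0.008):.1f} deg")    # 28.8 deg
```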