998 results for Similarity test


Relevance:

30.00%

Publisher:

Abstract:

Pharmaceutical equivalence is an important step towards the confirmation of similarity and interchangeability among pharmaceutical products, particularly regarding those that will not be tested for bioequivalence. The aim of this paper is to compare traditional difference testing with two one-sided equivalence tests in the assessment of pharmaceutical equivalence, by means of equivalence studies between similar, generic and reference products of acyclovir cream, atropine sulfate injection, meropenem for injection, and metronidazole injection. All tests were performed in accordance with the Brazilian Pharmacopeia or the United States Pharmacopeia. All four possible combinations of results arose in these comparisons of difference testing and equivalence testing: most of the difference tests did not show a significant difference, whereas most of the equivalence tests demonstrated similarity. We conclude that equivalence testing is more appropriate than difference testing, which makes it a useful tool to assess pharmaceutical equivalence in products that will not be tested for bioequivalence.
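The two one-sided tests idea can be sketched as follows: two products are declared equivalent when the confidence interval for the difference of their means falls entirely within a pre-set margin. This is a minimal, illustrative sketch with made-up assay values and a normal approximation, not the pharmacopeial procedure used in the paper:

```python
import math

def tost(sample_a, sample_b, margin, alpha=0.05):
    """Two one-sided tests (TOST) for equivalence of two means.

    Declares equivalence when the (1 - 2*alpha) confidence interval for
    the difference of means lies entirely within [-margin, +margin].
    Uses a normal approximation for simplicity (hypothetical example)."""
    na, nb = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / na
    mean_b = sum(sample_b) / nb
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (nb - 1)
    se = math.sqrt(var_a / na + var_b / nb)
    diff = mean_a - mean_b
    z = 1.645  # one-sided 95% normal quantile for alpha = 0.05
    lower, upper = diff - z * se, diff + z * se
    return lower > -margin and upper < margin

# Hypothetical assay results (% of label claim) for test and reference lots
test_lot = [99.1, 100.4, 98.8, 101.0, 99.7, 100.2]
ref_lot  = [100.0, 99.5, 100.8, 99.9, 100.3, 99.6]
print(tost(test_lot, ref_lot, margin=5.0))  # → True
```

Note the asymmetry with difference testing: here the burden of proof is reversed, so similarity must be demonstrated rather than merely not rejected.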

Relevance:

30.00%

Publisher:

Abstract:

Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters represents the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the State Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is expected to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test. In fact, high-quality seismological systems are thought to be capable of detecting and locating very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method are characterized by high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both of the above-mentioned techniques, we used cross-correlation among digital waveforms in order to minimize the errors linked with incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary tests of the reliability of our location techniques based on simulations, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using the data recorded by the IMS. Initially, the algorithm was applied to the differences among the original arrival times of the P phases, so the cross-correlation was not used. We found that the considerable geometrical spread, noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference), was substantially reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which we can suppose real closeness among the hypocenters, belonging to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times have been removed, or at least reduced.
The introduction of the cross-correlation did not bring evident improvements to our results: the two sets of locations (without and with the application of the cross-correlation technique) are very similar to each other. This suggests that the use of the cross-correlation did not substantially improve the precision of the manual pickings. Probably the pickings reported by the IDC are good enough to make the random picking error less important than the systematic error on travel times. As a further explanation for the limited improvement given by the cross-correlation, it should be remarked that the events included in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to the cross-correlation, we performed a signal interpolation in order to improve the time resolution. The resulting algorithm was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed to the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in poor SNR conditions). Another remarkable point of our procedure is that its application does not demand a long processing time, so the user can immediately check the results. During a field survey, this feature makes possible a quasi-real-time check, allowing the immediate optimization of the array geometry if the early results so suggest.
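The picking-refinement role of cross-correlation can be illustrated with a toy lag estimator: the delay between two similar waveforms is taken as the lag that maximises their normalised cross-correlation. This is an illustrative sketch with synthetic signals, not the thesis implementation:

```python
def best_lag(x, y, max_lag):
    """Return the lag (in samples) at which y best matches x, using a
    normalised cross-correlation over the overlapping portion."""
    def corr_at(lag):
        # pair x[i] with y[i + lag] wherever both samples exist
        pairs = [(x[i], y[i + lag]) for i in range(len(x))
                 if 0 <= i + lag < len(y)]
        n = len(pairs)
        mx = sum(a for a, _ in pairs) / n
        my = sum(b for _, b in pairs) / n
        num = sum((a - mx) * (b - my) for a, b in pairs)
        dx = sum((a - mx) ** 2 for a, _ in pairs) ** 0.5
        dy = sum((b - my) ** 2 for _, b in pairs) ** 0.5
        return num / (dx * dy) if dx and dy else 0.0
    return max(range(-max_lag, max_lag + 1), key=corr_at)

# y is x delayed by 3 samples; the estimator should recover that shift
x = [0, 0, 1, 3, 5, 3, 1, 0, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1, 3, 5, 3, 1, 0, 0, 0]
print(best_lag(x, y, max_lag=5))  # → 3
```

In practice the same idea is applied to windowed seismic phases, with sub-sample refinement by interpolation as the abstract describes.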

Relevance:

30.00%

Publisher:

Abstract:

The investigation concerned the wind profile in the first 30 meters of the stable atmospheric boundary layer within the framework of local similarity theory. To date, several experiments have confirmed the validity of the theory for boundary layers over level terrain and homogeneous surfaces. These ideal conditions are, however, infrequent in reality, so it is important to understand the limits of local similarity for boundary layers over complex terrain and inhomogeneous surfaces. Both conditions are present at Ny-Ålesund (Svalbard, Norway), where in 2009 the National Research Council of Italy (CNR) installed a 30 m tower, the Amundsen-Nobile Climate Change Tower (CCT), for the study of the Arctic boundary layer. The thesis work concerned wind and turbulence measurements acquired on the CCT from May 2012 to May 2014. A comparison of the wind speeds measured by the anemometers installed on the CCT revealed problems in the sonic data, manifested as systematic overestimates and greater scatter compared with the measurements from the propeller anemometers. A test of different methods for computing vertical gradients of wind speed showed little sensitivity of the results to the particular method used. The study examined the dimensionless vertical gradients of wind speed in the first 30 m of the stable boundary layer. Significant deviations between the observations and the values predicted by local similarity were found, particularly for the levels farthest from the ground and for increasing values of the stability parameter z/L (L being the local Obukhov length). In particular, dimensionless gradients lower than those predicted by the most widely used flux-gradient relationships were observed. These deviations, present mostly for z/L>0.1, were attributed to an enhancement of turbulence by terrain irregularities.
For less stable conditions, z/L<0.1, positive deviations between observed and expected gradients were attributed to the formation of internal boundary layers under onshore wind conditions. Several methods were proposed to estimate the effect of self-correlation in the derivation of flux-gradient relationships, arising from the shared variable u*. The formula for the linear self-correlation coefficient and its empirical probability distributions were derived, allowing the level of self-correlation present in the dataset to be estimated.
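The quantity under test in local similarity theory is the dimensionless wind gradient, φm = (κz/u*)(∂U/∂z), compared against a stable-side flux-gradient relation of the form φm = 1 + β·z/L. The sketch below uses hypothetical values (β = 4.7 and one invented measurement level), purely to illustrate the comparison made in the thesis:

```python
KAPPA = 0.4  # von Kármán constant

def phi_m(dU_dz, z, u_star):
    """Dimensionless wind gradient: phi_m = (kappa * z / u*) * dU/dz."""
    return KAPPA * z * dU_dz / u_star

def phi_m_flux_gradient(zeta, beta=4.7):
    """Widely used stable-side flux-gradient relation phi_m = 1 + beta*z/L
    (beta = 4.7 is one common choice; illustrative only)."""
    return 1.0 + beta * zeta

# Hypothetical observation at z = 10 m: dU/dz = 0.05 s^-1, u* = 0.2 m/s
obs = phi_m(0.05, 10.0, 0.2)        # observed dimensionless gradient
pred = phi_m_flux_gradient(0.05)    # prediction at z/L = 0.05
print(obs, pred, obs < pred)
```

An observed φm below the prediction, as in this invented example, corresponds to the turbulence-enhancement effect the thesis reports for z/L>0.1.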

Relevance:

30.00%

Publisher:

Abstract:

Resting-state functional connectivity (FC) fMRI (rs-fcMRI) offers an appealing approach to mapping the brain's intrinsic functional organization. Blood oxygen level dependent (BOLD) and arterial spin labeling (ASL) are the two main rs-fcMRI approaches to assess alterations in brain networks associated with individual differences, behavior and psychopathology. While the BOLD signal is stronger with a higher temporal resolution, ASL provides quantitative, direct measures of the physiology and metabolism of specific networks. This study systematically investigated the similarity and reliability of resting brain networks (RBNs) in BOLD and ASL. A 2×2×2 factorial design was employed where each subject underwent repeated BOLD and ASL rs-fcMRI scans on two occasions on two MRI scanners respectively. Both independent and joint FC analyses revealed common RBNs in ASL and BOLD rs-fcMRI with a moderate to high level of spatial overlap, verified by Dice Similarity Coefficients. Test-retest analyses indicated more reliable spatial network patterns in BOLD (average modal Intraclass Correlation Coefficients: 0.905±0.033 between-sessions; 0.885±0.052 between-scanners) than ASL (0.545±0.048; 0.575±0.059). Nevertheless, ASL provided highly reproducible (0.955±0.021; 0.970±0.011) network-specific CBF measurements. Moreover, we observed positive correlations between regional CBF and FC in core areas of all RBNs indicating a relationship between network connectivity and its baseline metabolism. Taken together, the combination of ASL and BOLD rs-fcMRI provides a powerful tool for characterizing the spatiotemporal and quantitative properties of RBNs. These findings pave the way for future BOLD and ASL rs-fcMRI studies in clinical populations that are carried out across time and scanners.
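The Dice Similarity Coefficient used to quantify spatial overlap between networks is simply 2|A∩B|/(|A|+|B|) over two binary network masks. A minimal sketch with invented toy masks:

```python
def dice_coefficient(mask_a, mask_b):
    """Dice Similarity Coefficient between two binary masks:
    2*|A ∩ B| / (|A| + |B|). A generic sketch of the overlap measure
    used to compare resting brain networks."""
    a = {i for i, v in enumerate(mask_a) if v}
    b = {i for i, v in enumerate(mask_b) if v}
    if not a and not b:
        return 1.0  # two empty masks overlap trivially
    return 2 * len(a & b) / (len(a) + len(b))

# Two toy "network masks" over 8 voxels (invented for illustration)
net_bold = [1, 1, 1, 0, 0, 1, 0, 0]
net_asl  = [1, 1, 0, 0, 0, 1, 1, 0]
print(dice_coefficient(net_bold, net_asl))  # → 0.75
```

A coefficient of 1 means identical masks, 0 means no overlap; the study's "moderate to high" overlap lies between.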

Relevance:

30.00%

Publisher:

Abstract:

The extraordinary rise of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has provided great benefits in all areas of society. Alongside these benefits come new challenges for the protection and privacy of information and its content, such as identity theft and the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal world and the business world regarding access to information. In both worlds, Cryptography has played a key role by providing the tools needed to ensure the confidentiality, integrity and availability both of personal data and of information in general. Biometrics, in turn, has proposed and offered different techniques to authenticate individuals through biometric traits such as fingerprints, iris, hand geometry, voice, gait, etc. Each of these sciences, Cryptography and Biometrics, provides solutions to specific problems of data protection and user authentication, which would be greatly strengthened if certain characteristics of both were combined towards common objectives. It is therefore imperative to intensify research in these areas, combining the mathematical algorithms and primitives of Cryptography with Biometrics, to meet the growing demand for new, more secure and usable techniques that simultaneously improve data protection and user identification.
In this combination, the concept of cancelable biometrics has become a cornerstone of the user authentication and identification process, since it provides revocation and cancellation properties for biometric traits. The contribution of this thesis concerns the main aspect of Biometrics, namely the secure and efficient authentication of users through their biometric templates, addressed through three different approaches. The first is the design of a fuzzy crypto-biometric scheme that implements the principles of cancelable biometrics to identify users while dealing with intra- and inter-user variability, without compromising the biometric templates extracted from legitimate users. The second is the design of a new Similarity Preserving Hash Function (SPHF). Such functions are currently used in the digital forensics field to find similarities in the content of distinct but similar files, in order to determine to what extent the files may be considered equal. The function defined in this research work, besides improving on the results of the main functions developed to date, extends their use to the comparison of iris templates. The third is the development of a new mechanism for comparing iris templates that treats the templates as signals and compares them using the Walsh-Hadamard transform (complemented with three other algorithms). The results obtained are excellent, taking into account the security and privacy requirements mentioned above.
Each of the three schemes has been implemented in order to run experiments and test its operational efficacy in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF were implemented in Java, while the process based on the Walsh-Hadamard transform was implemented in Matlab. The experiments used a database of iris images (CASIA-IrisV2) to simulate a population of system users. In the particular case of the SPHF, additional experiments were carried out to test its usefulness in the digital forensics field by comparing files and images with similar and different content. The ratios of efficiency and effectiveness regarding user authentication, i.e. the False Non-Match Rate and the False Match Rate, were calculated for each scheme with different parameters and cases in order to analyse their behaviour.
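The fast Walsh-Hadamard transform at the heart of the third approach can be computed with a butterfly scheme analogous to the FFT. Below is a generic, unnormalised sketch in Python (the thesis implementation is in Matlab, and the input bit pattern here is invented):

```python
def walsh_hadamard(signal):
    """Fast Walsh-Hadamard transform of a sequence whose length is a
    power of 2 (unnormalised). Each stage combines pairs of samples
    h apart into their sum and difference."""
    data = list(signal)
    h = 1
    while h < len(data):
        for i in range(0, len(data), h * 2):
            for j in range(i, i + h):
                x, y = data[j], data[j + h]
                data[j], data[j + h] = x + y, x - y
        h *= 2
    return data

# A toy 8-bit "template"; the first coefficient is the sum of the input
print(walsh_hadamard([1, 0, 1, 0, 0, 1, 1, 0]))  # → [4, 2, 0, -2, 0, 2, 0, 2]
```

Because the transform is its own inverse up to a factor of N, applying it twice returns N times the original sequence, which makes it convenient for signal-domain comparisons.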

Relevance:

30.00%

Publisher:

Abstract:

The problem of similarity measurement of biological signals is considered in this article. The dynamic time warping (DTW) algorithm is considered as a possible solution. A short overview of this algorithm and its modifications is given, and a testing procedure for different modifications of DTW, based on artificial test signals, is presented.
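The core DTW recurrence fills a cost matrix in which each cell adds the local distance to the cheapest of the three preceding alignments. A minimal sketch of the classic algorithm (absolute difference as the local cost; the test signals are invented):

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two sequences.
    d[i][j] holds the cheapest alignment cost of a[:i] with b[:j]."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the best of: match, insertion, deletion
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A time-stretched copy aligns at zero cost, unlike a pointwise comparison
print(dtw_distance([1, 2, 3, 2, 1], [1, 2, 2, 3, 3, 2, 1]))  # → 0.0
```

The modifications the article surveys (step patterns, windowing constraints) change the recurrence or restrict which cells are filled, but the matrix structure stays the same.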

Relevance:

30.00%

Publisher:

Abstract:

Product reliability and its environmental performance have become critical elements of a product's specification and design. To obtain a high level of confidence in the reliability of a design, it is customary to test it under realistic conditions in a laboratory. The objective of this work is to examine the feasibility of designing mechanical test rigs which exhibit prescribed dynamical characteristics; the product is then attached to the rig, and excitation applied to the rig transmits representative vibration levels into the product. The philosophical considerations made at the outset of the project are discussed, as they form the basis for the resulting design methodologies. An attempt is made to identify the parameters of a test rig directly from the spatial model derived during the system identification process; it is shown that a feasible test rig design cannot be identified with this technique. A finite-dimensional optimal design methodology is therefore developed which identifies the parameters of a discrete spring/mass system that is dynamically similar to a point coordinate on a continuous structure. This methodology is incorporated within a further procedure which derives a structure comprising a continuous element and a discrete system. The combined methodology is used to obtain point-coordinate similarity for two planes of motion, and is validated by experimental tests. A limitation of this approach is that multi-coordinate similarity is impossible to achieve, owing to an interaction between the discrete system and the continuous element at points away from the coordinate of interest. The work highlights the importance of the continuous element, and a design methodology is developed for continuous structures. This methodology is based upon distributed-parameter optimal design techniques and allows an initially poor design estimate to be moved in a feasible direction towards an acceptable design solution.
Cumulative damage theory is used to provide a quantitative method of assessing the quality of dynamic similarity. It is shown that the combination of modal analysis techniques and cumulative damage theory provides a feasible design synthesis methodology for representative test rigs.
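Point-coordinate dynamic similarity can be pictured through the receptance (displacement per unit force) of a single degree-of-freedom spring/mass/damper system: a rig is similar at a coordinate if its receptance matches the target's over the band of interest. The sketch below is only a one-degree-of-freedom illustration of that idea with invented parameter values, not the design methodology itself:

```python
import math

def receptance(m, c, k, omega):
    """Magnitude of the point receptance |X/F| of a single
    degree-of-freedom spring/mass/damper system at circular
    frequency omega: 1 / |k - m*omega^2 + i*c*omega|."""
    return 1.0 / math.hypot(k - m * omega ** 2, c * omega)

target = dict(m=2.0, c=4.0, k=8000.0)  # hypothetical target structure
rig    = dict(m=1.0, c=2.0, k=4000.0)  # candidate rig, all parameters halved

# Same natural frequency (k/m identical), but the receptance magnitude
# differs by the scale factor: matching frequencies alone is not enough.
w = 40.0
print(receptance(omega=w, **target), receptance(omega=w, **rig))
```

This is why the methodology must match the full dynamic response at the coordinate, not just resonant frequencies.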

Relevance:

30.00%

Publisher:

Abstract:

In April 2009, Google Images added a filter for narrowing search results by colour. Several other systems for searching image databases by colour were also released around this time. These colour-based image retrieval systems enable users to search image databases either by selecting colours from a graphical palette (i.e., query-by-colour), by drawing a representation of the colour layout sought (i.e., query-by-sketch), or both. It was comments left by readers of online articles describing these colour-based image retrieval systems that provided us with the inspiration for this research. We were surprised to learn that the underlying query-based technology used in colour-based image retrieval systems today remains remarkably similar to that of systems developed nearly two decades ago. Discovering this ageing retrieval approach, as well as uncovering a large user demographic requiring image search by colour, made us eager to research more effective approaches for colour-based image retrieval. In this thesis, we detail two user studies designed to compare the effectiveness of systems adopting similarity-based visualisations, query-based approaches, or a combination of both, for colour-based image retrieval. In contrast to query-based approaches, similarity-based visualisations display and arrange database images so that images with similar content are located closer together on screen than images with dissimilar content. This removes the need for queries, as users can instead visually explore the database using interactive navigation tools to retrieve images from the database. 
As we found existing evaluation approaches to be unreliable, we describe how we assessed and compared systems adopting similarity-based visualisations, query-based approaches, or both, meaningfully and systematically using our Mosaic Test - a user-based evaluation approach in which evaluation study participants complete an image mosaic of a predetermined target image using the colour-based image retrieval system under evaluation.
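Query-by-colour systems of the kind described typically rest on a quantised colour histogram representation, with similarity computed bin by bin. The sketch below (histogram intersection over invented toy "images") illustrates one common such measure, not necessarily the one used by the systems studied:

```python
def colour_histogram(pixels, bins=4):
    """Quantised, normalised RGB histogram of an image given as
    (r, g, b) tuples with channel values in 0-255."""
    hist = {}
    step = 256 // bins
    for r, g, b in pixels:
        key = (r // step, g // step, b // step)
        hist[key] = hist.get(key, 0) + 1
    total = len(pixels)
    return {k: v / total for k, v in hist.items()}

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: sum of bin-wise minima of two normalised
    histograms. Higher means more similar colour content."""
    return sum(min(h1.get(k, 0.0), h2.get(k, 0.0)) for k in set(h1) | set(h2))

# Invented 4-pixel "images"
red_image  = [(250, 10, 10)] * 4
pink_image = [(250, 10, 10)] * 3 + [(250, 200, 200)]
blue_image = [(10, 10, 250)] * 4
h_red, h_pink, h_blue = map(colour_histogram, (red_image, pink_image, blue_image))
print(histogram_intersection(h_red, h_pink))  # → 0.75
print(histogram_intersection(h_red, h_blue))  # → 0.0
```

A similarity-based visualisation would use such pairwise scores to place images with close colour content near each other on screen, rather than answering a single query.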

Relevance:

30.00%

Publisher:

Abstract:

With the development of information technology, the theory and methodology of complex networks have been introduced into language research, representing the language system as a complex network composed of nodes and edges for quantitative analysis of language structure. The development of dependency grammar provides theoretical support for the construction of a treebank corpus, making a statistical analysis of complex networks possible. This paper introduces the theory and methodology of complex networks and builds dependency syntactic networks based on a treebank of speeches from the EEE-4 oral test. Through analysis of the overall characteristics of the networks, including the number of edges, the number of nodes, the average degree, the average path length, the network centrality and the degree distribution, it aims to find potential differences and similarities in the networks across various grades of speaking performance. Through clustering analysis, this research intends to demonstrate the discriminating power of the network parameters and to provide a potential reference for scoring speaking performance.
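The global metrics listed (average degree, average path length, and so on) can be computed directly from the edge list of a dependency network. A minimal sketch over an invented three-dependency sentence:

```python
from collections import deque

def network_stats(edges):
    """Average degree and average shortest-path length of a connected
    undirected network given as (a, b) edges; a sketch of two of the
    global metrics computed on dependency syntactic networks."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    nodes = list(adj)
    avg_degree = sum(len(adj[n]) for n in nodes) / len(nodes)
    # BFS from every node to accumulate shortest-path lengths
    total, pairs = 0, 0
    for src in nodes:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return avg_degree, total / pairs

# Tiny invented dependency network: words linked by syntactic dependencies
edges = [("likes", "John"), ("likes", "music"), ("music", "loud")]
print(network_stats(edges))
```

On real treebank-derived networks these values, together with centrality and the degree distribution, form the feature vector whose discriminating power the paper tests by clustering.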

Relevance:

20.00%

Publisher:

Abstract:

The aim of the study was to develop a culturally adapted translation of the 12-item smell identification test from Sniffin' Sticks (SS-12) for the Estonian population in order to help diagnose Parkinson's disease (PD). A standard translation of the SS-12 was created and 150 healthy Estonians were questioned about the smells used as response options in the test. Unfamiliar smells were replaced by culturally familiar options. The adapted SS-12 was applied to 70 controls across all age groups, and thereafter to 50 PD patients and 50 age- and sex-matched controls. 14 of the 48 response options used in the SS-12 were replaced with familiar smells in the adapted version, in which the mean rate of correct response was 87% (range 73-99), compared to 83% with the literal translation (range 50-98). In PD patients, the average adapted SS-12 score (5.4/12) was significantly lower than in controls (average score 8.9/12), p < 0.0001. A multiple linear regression using the SS-12 score as the outcome measure showed that diagnosis and age independently influenced the result of the SS-12. A logistic regression using the SS-12 and age as covariates showed that the SS-12 (but not age) correctly classified 79.0% of subjects into the PD and control categories; using a cut-off of <7 gave a sensitivity of 76% and a specificity of 86% for the diagnosis of PD. The developed SS-12 cultural adaptation is appropriate for testing olfaction in Estonia for the purpose of PD diagnosis.
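Sensitivity and specificity at a score cut-off follow directly from the counts of correctly flagged patients and correctly cleared controls. The sketch below mirrors the "<7" decision rule with invented SS-12 scores, not the study's data:

```python
def sensitivity_specificity(scores_patients, scores_controls, cutoff):
    """Sensitivity and specificity of the rule 'score < cutoff means
    disease'. Scores below the cutoff in patients are true positives;
    scores at or above it in controls are true negatives."""
    tp = sum(s < cutoff for s in scores_patients)
    tn = sum(s >= cutoff for s in scores_controls)
    return tp / len(scores_patients), tn / len(scores_controls)

# Hypothetical SS-12 scores (0-12) for PD patients and controls
pd_scores      = [4, 5, 6, 3, 7, 5, 8, 6, 4, 5]
control_scores = [9, 10, 8, 7, 11, 9, 6, 10, 8, 9]
print(sensitivity_specificity(pd_scores, control_scores, cutoff=7))  # → (0.8, 0.9)
```

Moving the cutoff trades one rate against the other, which is why the study reports both at its chosen threshold.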

Relevance:

20.00%

Publisher:

Abstract:

To investigate the pulmonary response to exercise of non-morbidly obese adolescents, considering gender. A prospective cross-sectional study was conducted with 92 adolescents (47 obese and 45 eutrophic), divided into four groups according to obesity and gender. Anthropometric parameters, pulmonary function (spirometry and oxygen saturation [SatO2]), heart rate (HR), blood pressure (BP), respiratory rate (RR), and respiratory muscle strength were measured. Pulmonary function parameters were measured before, during, and after the exercise test. BP and HR were higher in obese individuals during the exercise test (p = 0.0001). SatO2 values decreased during exercise in obese adolescents (p = 0.0001). Obese males had higher maximum inspiratory and expiratory pressures (p = 0.0002) than obese and eutrophic females. Obese males showed lower values of maximum voluntary ventilation, forced vital capacity, and forced expiratory volume in the first second than eutrophic males, both before and after exercise (p = 0.0005). Obese females had greater inspiratory capacity than eutrophic females (p = 0.0001). Expiratory reserve volume was lower in obese subjects than in controls (p ≤ 0.05). Obese adolescents presented changes in pulmonary function at rest, and these changes remained present during exercise. The spirometric and cardiorespiratory values differed across the four study groups. The present data demonstrate that, in spite of differences in lung growth, the pattern of fat distribution alters pulmonary function differently in obese female and male adolescents.

Relevance:

20.00%

Publisher:

Abstract:

To assess binocular detection grating acuity using the LEA GRATINGS test and establish age-related norms in healthy infants during their first 3 months of life. In this prospective, longitudinal study of healthy infants with a clear red reflex at birth, responses to gratings were measured at 1, 2, and 3 months of age using LEA gratings at a distance of 28 cm. The results were recorded as detection grating acuity values, which were arranged in frequency tables and converted to a one-octave scale for statistical analysis. For the repeated measurements, analysis of variance (ANOVA) was used to compare the detection grating acuity results between ages. A total of 133 infants were included. The binocular responses to gratings showed development toward higher mean values and spatial frequencies, ranging from 0.55 ± 0.70 cycles per degree (cpd), or 1.74 ± 0.21 logMAR, in month 1 to 3.11 ± 0.54 cpd, or 0.98 ± 0.16 logMAR, in month 3. Repeated-measures ANOVA indicated differences among grating acuity values across the three age groups. The LEA GRATINGS test allowed assessment of detection grating acuity and its development in a cohort of healthy infants during their first 3 months of life.
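The two acuity scales reported are linked by a fixed conversion: one grating cycle spans two stripes, so the minimum angle of resolution in arcminutes is 60/(2·cpd) = 30/cpd, and logMAR is its base-10 logarithm. The abstract's own mean values round-trip through this formula:

```python
import math

def grating_cpd_to_logmar(cpd):
    """Convert grating acuity in cycles per degree to logMAR.
    MAR (arcminutes) = 30 / cpd, and logMAR = log10(MAR)."""
    return math.log10(30.0 / cpd)

# The values reported in the abstract are consistent with the conversion:
print(round(grating_cpd_to_logmar(0.55), 2))  # → 1.74
print(round(grating_cpd_to_logmar(3.11), 2))  # → 0.98
```

At 30 cpd (roughly 20/20 vision) the formula gives logMAR 0, the conventional reference point.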

Relevance:

20.00%

Publisher:

Abstract:

Objective: To assess the neurodevelopmental functions (cognition, language and motor function) of survivors of twin-twin transfusion syndrome (TTTS). Method: Observational cross-sectional study of 67 monochorionic diamniotic twins who underwent fetoscopic laser coagulation (FLC) for treatment of TTTS. The study was conducted at the Center for Investigation in Pediatrics (CIPED), Universidade Estadual de Campinas. Ages ranged from one month and four days to two years and four months. The Bayley Scales of Infant and Toddler Development Screening Test-III were used for evaluation. Results: Most children reached the competent category and were classified as having appropriate performance. Preterm children scored worse than term infants on the gross motor subtest (p = 0.036). Conclusion: The majority of children reached the expected development for their age. Despite the good neurodevelopment, children classified as at risk should have their development monitored throughout childhood.

Relevance:

20.00%

Publisher:

Abstract:

This work investigated the cytotoxic and genotoxic potential of water from the River Paraíba do Sul (Brazil) using Allium cepa roots. An anatomo-morphological parameter (root length), mitotic indices, and the frequency of micronuclei were analysed. Eight bulbs were chosen at random for treatment for 24 to 120 hours with river water collected in 2005 and 2006 from sites in the cities of Tremembé and Aparecida (São Paulo state, Brazil). Daily measurements of the length of the roots grown from each bulb were carried out throughout the experiment. The mitotic index (MI) and the frequency of micronuclei (MN) were determined for 2000 cells per root, using 3-5 root tips from other bulbs (7-10). Only in the roots treated with water collected in 2005 in Tremembé was there a decrease in root length growth compared to the respective control. However, a reduction in MI values was verified for both sites for that year. Considering the data on root length growth and especially the MI values, a cytotoxic potential is suggested for the water of the River Paraíba do Sul at Tremembé and Aparecida in 2005. On the other hand, since the MN frequency was not affected by the river water treatments in that year, genotoxicity is not assumed for the river water sampled at the aforementioned places.
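The two endpoints scored per root reduce to simple proportions over the 2000 cells counted. A minimal sketch with invented counts:

```python
def mitotic_index(dividing_cells, total_cells):
    """Mitotic index: percentage of cells in division among all cells
    scored (the study scored 2000 cells per root)."""
    return 100.0 * dividing_cells / total_cells

def micronucleus_frequency(cells_with_mn, total_cells):
    """Micronucleus frequency expressed per 1000 cells scored."""
    return 1000.0 * cells_with_mn / total_cells

# Hypothetical counts for a control root and a treated root (2000 cells each)
print(mitotic_index(160, 2000), mitotic_index(90, 2000))  # → 8.0 4.5
print(micronucleus_frequency(4, 2000))                    # → 2.0
```

A drop in MI relative to the control, as in this invented pair, is the signature of cytotoxicity the study reports, while an unchanged MN frequency argues against genotoxicity.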

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To compare the intraocular pressure (IOP) rise in normal individuals and primary open-angle glaucoma patients, and to evaluate the safety and efficacy of ibopamine eye drops in different concentrations as a provocative test for glaucoma. METHODS: Glaucoma patients underwent (in the same eye) the ibopamine provocative test at two concentrations, 1% and 2%, in a random sequence at least 3 weeks but no more than 3 months apart. Normal individuals were randomly submitted to one of the two concentrations of ibopamine (1% or 2%). The test was considered positive if there was an IOP rise greater than 3 or 4 mmHg at 30 or 45 minutes, in order to determine which variant of the test has the best sensitivity (Se)/specificity (Sp). RESULTS: There was no statistically significant difference in any of the IOP measurements between 1% and 2% ibopamine. IOP was significantly higher at 30 and 45 minutes with both concentrations (p < 0.001). The best sensitivity/specificity ratio was achieved with the cutoff point set at greater than 3 mmHg at 45 minutes with 2% ibopamine (area under the ROC curve: 0.864; Se: 84.6%; Sp: 73.3%). All patients described slight burning after instillation of ibopamine. CONCLUSION: 2% ibopamine is recommended as a provocative test for glaucoma. Because both concentrations have a similar ability to raise IOP, 1% ibopamine may be used to treat ocular hypotony.
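The area under the ROC curve reported (0.864) has a direct empirical reading: it is the probability that a randomly chosen glaucoma patient shows a larger IOP rise than a randomly chosen normal individual. A sketch with invented IOP rises (not the trial's data):

```python
def roc_auc(pos, neg):
    """Empirical ROC AUC: the fraction of (positive, negative) pairs in
    which the positive case scores higher, counting ties as one half.
    Here the 'score' is the IOP rise after ibopamine instillation."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical IOP rises (mmHg) at 45 minutes with 2% ibopamine
glaucoma = [5, 6, 4, 7, 3, 5, 8, 4]
normal   = [1, 2, 3, 0, 2, 4, 1, 2]
print(roc_auc(glaucoma, normal))
```

An AUC of 0.5 would mean the test cannot separate the groups at all; values approaching 1 mean the IOP-rise distributions barely overlap.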