999 results for intelligenza artificiale, test di Turing
Abstract:
We examined the correlation between results obtained from the in vivo Draize test for ocular irritation and in vitro results obtained from the sheep red blood cell (RBC) haemolytic assay, which assesses haemolysis and protein denaturation in erythrocytes induced by cosmetic products. We sought to validate the haemolytic assay as a preliminary test for identifying highly irritant products, and also to evaluate the in vitro test as an alternative assay for replacement of the in vivo test. In vitro and in vivo analyses were carried out on 19 cosmetic products, in order to correlate the lesions in the ocular structures with three in vitro parameters: (i) the extent of haemolysis (H50); (ii) the protein denaturation index (DI); and (iii) the H50/DI ratio, which reflects the irritation potential (IP). There was significant correlation between maximum average scores (MAS) and the parameters determined in vitro (r = 0.752-0.764). These results indicate that the RBC assay is a useful and rapid screening method for assessing the IP of cosmetic products, predicting the IP value with a high level of concordance (94.7%). The assay showed high sensitivity and specificity rates of 91.6% and 100%, respectively.
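The screening logic above rests on correlating in vivo MAS scores with the in vitro parameters. As a minimal sketch, the Pearson r reported in the abstract can be computed as follows; the score values below are invented for illustration and are not data from the study.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two score vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Invented example: in vivo maximum average scores (MAS) vs. an
# in vitro parameter such as H50 for five hypothetical products.
mas = [2.0, 15.0, 25.0, 45.0, 60.0]
h50 = [0.9, 0.5, 0.4, 0.2, 0.1]
r = pearson_r(mas, h50)   # strong negative association in this toy data
```

In the study itself, the correlations of the in vitro parameters against MAS were r = 0.752-0.764.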
Abstract:
The present study was carried out to evaluate the Malar-CheckTM Pf test, an immunochromatographic assay that detects Plasmodium falciparum Histidine Rich Protein II, requires no equipment, and is easy and rapid to perform. In dilution assays performed to test sensitivity against known parasite density, Malar-CheckTM was compared with the thick blood smear (TBS), the gold standard for diagnosis. Palo Alto isolate or P. falciparum blood from patients with different parasitemias was used. The average cut-off points for each technique in three independent experiments were 12 and 71 parasites/mm³ (TBS and Malar-CheckTM, respectively). In the field assays, samples were collected from patients with fever who had visited endemic regions. Compared to TBS, Malar-CheckTM yielded true-positive results in 38 patients, false-positive results in 3, true-negative results in 23, and a false-negative result in 1. Malar-CheckTM performed with samples from falciparum-infected patients after treatment showed persistence of antigen for up to 30 days. Malar-CheckTM should aid the diagnosis of P. falciparum in remote areas and improve routine diagnosis even when microscopy is available. Previous P. falciparum infection, which can produce a false-positive test in cured individuals, should be considered. The prompt results obtained with the Malar-CheckTM for early diagnosis could prevent progression to severe disease.
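From the field-assay counts reported above (38 true positives, 3 false positives, 23 true negatives, 1 false negative against TBS), the standard diagnostic metrics follow directly; the metric definitions are textbook formulas, not taken from the paper:

```python
# Field-assay counts for Malar-CheckTM vs. thick blood smear (TBS),
# as reported in the abstract.
TP, FP, TN, FN = 38, 3, 23, 1

sensitivity = TP / (TP + FN)                  # 38/39, about 97.4%
specificity = TN / (TN + FP)                  # 23/26, about 88.5%
ppv = TP / (TP + FP)                          # positive predictive value
npv = TN / (TN + FN)                          # negative predictive value
accuracy = (TP + TN) / (TP + FP + TN + FN)

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```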
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into the real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit sum constraint is honoured. The structure of the best-known, single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
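The transform-and-back-transform step described above can be sketched with the centred log-ratio (clr), one of Aitchison's log-ratio transforms; the three-component death density below is a toy example, and the paper's actual choice of transform may differ.

```python
import numpy as np

def clr(p):
    """Centred log-ratio: map a positive unit-sum composition to real space."""
    p = np.asarray(p, dtype=float)
    logp = np.log(p)
    return logp - logp.mean()

def clr_inv(z):
    """Back-transform: exponentiate, then re-close to the unit sum."""
    w = np.exp(np.asarray(z, dtype=float))
    return w / w.sum()

# Toy life-table death density over three age groups (sums to 1).
d = np.array([0.2, 0.5, 0.3])
z = clr(d)            # unconstrained real values: model and forecast here
d_back = clr_inv(z)   # unit-sum constraint restored after back-transform
```

Forecasting is done on the unconstrained `z` values, so any multivariate method can be applied before closing the result back to a valid composition.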
Abstract:
OBJECTIVE Serum levels of soluble TNF-like weak inducer of apoptosis (sTWEAK) and its scavenger receptor CD163 (sCD163) have been linked to insulin resistance. We analysed the usefulness of these cytokines as biomarkers of type 2 diabetes in a Spanish cohort, together with their relationship to food consumption, in the setting of the Di@bet.es study. RESEARCH DESIGN AND METHODS This is a cross-sectional, matched case-control study of 514 type 2 diabetes subjects and 517 controls with a normal oral glucose tolerance test (NOGTT), using data from the Di@bet.es study. Study variables included a structured clinical and demographic survey, a food frequency questionnaire and a physical examination. Serum concentrations of sTWEAK and sCD163 were measured by ELISA. Linear regression analysis determined which variables were related to sTWEAK and sCD163 levels. Logistic regression analysis was used to estimate odds ratios (ORs) of presenting type 2 diabetes. RESULTS sCD163 concentrations and the sCD163/sTWEAK ratio were 11.0% and 15.0% higher, respectively (P<0.001), in type 2 diabetes than in controls. Following adjustment for various confounders, the OR for presenting type 2 diabetes in subjects in the highest vs the lowest tertile of sCD163 was 2.01 (95% CI, 1.46-2.97; P for trend <0.001). Coffee and red wine consumption was negatively associated with serum levels of sCD163 (P = 0.0001 and P = 0.002 for coffee and red wine intake, respectively). CONCLUSIONS High circulating levels of sCD163 are associated with type 2 diabetes in the Spanish population. The association between coffee and red wine intake and these biomarkers deserves further study to confirm its potential role in type 2 diabetes.
Abstract:
Since the spring of 2004, a new metro line has been under construction in the city of Lausanne, Switzerland. The new line, the M2, will be 6 km long and, from 2008, will traverse the city from south to north, linking Ouchy on Lake Geneva (alt. 373 m) to Epalinges (alt. 711 m). The civil engineering project determined that the line would be located primarily in the Molasse bedrock. Since the preparatory project in 1999, a great quantity of geological data has been collected, and the many boreholes drilled on the site provided a unique opportunity to undertake a detailed study of urban microgravimetry. The goal was to evaluate the thickness of the morainic fill over the molassic bedrock, which lies at depths varying from a few metres to about thirty metres, and to establish a section along the axis of the future line. A further aim was to show that this non-destructive geophysical method can reduce the number of mechanical soundings required in both the preliminary and the final project, limiting both the costs and the surface disruption caused by such work. The two test zones chosen, one in the northern part of the city and one in the city centre, are characterised by different types of urbanisation and exemplify the difficulties of an urban environment with dense traffic; they were therefore well suited to developing an overall methodology for urban microgravimetry. Microgravimetry in an urban environment requires careful correction of the gravity disturbances due to topography, buildings, cellars and buried utility networks, in order to isolate the gravity effect due exclusively to the thickness of the loose-soil fill. Given the intensity of topographic corrections in an urban environment, particular importance was given to basements, whose gravity effects can reach the order of one tenth of a mGal and chiefly affect the precision of the Bouguer anomaly. The roadway and basements were therefore incorporated into the digital terrain model (DTM) to build an urban digital terrain model, for which we introduce the acronym "MNTU", while the effects of the buildings themselves were treated independently. We propose establishing preliminary topographic-correction maps, based on land-register data and on assumptions about basement depths and building heights; the availability of such a map before a gravimetric campaign makes it possible to choose optimal measuring sites. We have also seen that an a priori filter can be used when the form and intensity of an anomaly correspond visually to a particular building; this strategy must be applied with caution, because when several anomalies from different structures superpose, significant shifts can be generated. The modelling results proved very convincing, detecting zones that differ appreciably from the preliminary geological model. The adaptability of the gravimetric technique allows it to be applied in all phases of a civil engineering project such as the construction of an underground metro line.
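The cellar effect of roughly a tenth of a mGal cited above can be sanity-checked with the infinite Bouguer slab approximation Δg = 2πGΔρh; the density contrast and cellar height below are assumed values, not figures from the study.

```python
import math

# Order-of-magnitude check of the "one tenth of a mGal" cellar effect,
# using the infinite Bouguer slab approximation dg = 2*pi*G*drho*h.
# Density contrast and cellar height are assumed, not from the text.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
drho = -1800.0       # kg/m^3: air-filled cellar vs. ~1.8 t/m^3 soil
h = 2.5              # m, assumed cellar height

dg_si = 2 * math.pi * G * drho * h   # m/s^2
dg_mgal = dg_si / 1e-5               # 1 mGal = 1e-5 m/s^2

print(f"{dg_mgal:.2f} mGal")
```

This gives about -0.19 mGal for a laterally extensive slab; a real cellar of finite footprint produces a fraction of that, consistent with the order of a tenth of a mGal.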
Abstract:
BACKGROUND: Prognosis prediction for resected primary colon cancer is based on the Tumor Node Metastasis (TNM) staging system. We investigated whether four well-documented gene expression risk scores can improve patient stratification. METHODS: Microarray-based versions of the risk scores were applied to a large independent cohort of 688 stage II/III tumors from the PETACC-3 trial. Prognostic value for relapse-free survival (RFS), survival after relapse (SAR), and overall survival (OS) was assessed by regression analysis. Improvement over a reference prognostic model was assessed with the area under the curve (AUC) of receiver operating characteristic (ROC) curves. All statistical tests were two-sided, except for the AUC increase. RESULTS: All four risk scores (RSs) showed a statistically significant association (single-test, P < .0167) with OS or RFS in univariate models, but with HRs below 1.38 per interquartile range. Three scores were predictors of shorter RFS, one of shorter SAR. Each RS could only marginally improve an RFS or OS model with the known factors T-stage, N-stage, and microsatellite instability (MSI) status (AUC gains < 0.025 units). The pairwise interscore discordance was never high (maximal Spearman correlation = 0.563). A combined score showed a trend to higher prognostic value and a higher AUC increase for OS (HR = 1.74, 95% confidence interval [CI] = 1.44 to 2.10, P < .001, AUC from 0.6918 to 0.7321) and RFS (HR = 1.56, 95% CI = 1.33 to 1.84, P < .001, AUC from 0.6723 to 0.6945) than any single score. CONCLUSIONS: The four tested gene expression-based risk scores provide prognostic information but contribute only marginally to improving models based on established risk factors. A combination of the risk scores might provide more robust information. Predictors of RFS and SAR might need to be different.
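The AUC comparisons above can be reproduced in miniature: AUC equals the probability that a randomly chosen event case receives a higher risk score than a randomly chosen event-free case (the Mann-Whitney formulation). The scores below are invented for illustration.

```python
def auc(scores_pos, scores_neg):
    """AUC as the probability that a positive case outranks a negative one
    (Mann-Whitney U divided by n_pos * n_neg); ties count one half."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores for relapsed vs. relapse-free patients.
relapsed = [0.9, 0.7, 0.6, 0.4]
relapse_free = [0.5, 0.3, 0.2, 0.1]
print(auc(relapsed, relapse_free))  # 0.9375 for this toy data
```

An AUC of 0.5 means no discrimination; gains such as the study's 0.6918 to 0.7321 are increases of this probability.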
Abstract:
Introduction: One of the main goals of exercise testing in children is evaluation of exercise capacity. There are many testing protocols, but the Bruce treadmill protocol is widely used among pediatric cardiology centers. Thirty years ago, Cumming et al. were the first to establish normal values for children from North America (Canada) aged 4 to 18 years old. No data were ever published for children from Western Europe. Our study aimed to assess the validity of the normal values from Cumming et al. for children from Western Europe in the 21st century. Methods: This is a retrospective cohort study in a tertiary care children's hospital. 144 children referred to our institution but finally diagnosed as having a normal heart underwent exercise stress testing using the Bruce protocol between 1999 and 2006. Data from 59 girls and 85 boys aged 6 to 18 were reviewed. Mean endurance time (ET) for each age category and gender was compared with the mean normal values from Cumming et al. by an unpaired t-test. Results: Mean ET increases with age until 15 years old in girls and then decreases. Mean endurance time increases continuously from 6 to 18 years old in boys. The increase is more pronounced in boys than in girls. In our study, a significantly higher mean ET was found for boys in the age categories 10 to 12, 13 to 15 and 16 to 18. No significant difference was found in any other group. Conclusions: Some normal values from Cumming et al., established in 1978 for ET with the Bruce protocol, are probably no longer appropriate today for children from Western Europe. Our study showed that mean ET is higher for boys from 10 to 18 years old. Despite common beliefs, cardiovascular conditioning does not yet seem reduced in children from Western Europe. New data for Bruce treadmill exercise testing of healthy children, 4 to 18 years old, living in Western Europe are required.
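The comparison of mean endurance times can be sketched as follows; a Welch (unequal-variance) statistic is used here as one common unpaired t-test variant, and the endurance times are invented for illustration, not data from the study.

```python
import math

def welch_t(x, y):
    """Unpaired (Welch) t statistic for two independent samples."""
    nx, ny = len(x), len(y)
    mx = sum(x) / nx
    my = sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Invented endurance times (minutes) for one age/sex category
# vs. a reference group; values are illustrative only.
cohort_2006 = [13.1, 13.8, 14.2, 12.9, 14.5]
reference = [12.0, 12.4, 11.8, 12.6, 12.2]
t = welch_t(cohort_2006, reference)  # positive t: higher mean ET in cohort
```

The resulting t value would then be compared against the t distribution with the Welch-Satterthwaite degrees of freedom to obtain a p-value.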
Abstract:
Aims: 1) To carry out an experimental study with normal and mentally deficient subjects from Sevilla, Córdoba and Las Palmas de Gran Canaria. 2) To determine the correlation between the F.L. Goodenough and D.B. Harris human figure drawing test and the overall mean school grades of the chosen sample. 3) To offer these areas a standardisation of the Goodenough-Harris human figure drawing test, thereby extending to the Spanish population one more instrument for the diagnosis of intelligence. Sample: 610 subjects from ordinary schools aged 6 to 10 and 162 mentally deficient subjects aged 5 to 21 attending special-education centres, all from Sevilla and Córdoba, plus 500 children from ordinary schools in Las Palmas de Gran Canaria aged 6 to 10 and 50 nine-year-old subjects from special-education centres. The study analyses the theoretical bases of the Goodenough-Harris test, reviews the Spanish studies on the Goodenough test, and puts forward hypotheses to be tested in an investigation carried out on two samples of normal and mentally deficient children in the Spanish provinces of Sevilla, Córdoba and Las Palmas de Gran Canaria. Instrument: the Goodenough-Harris human figure drawing test. The test was administered according to the norms given in the manual of the Test della figura umana (ed. Organizzazioni Speciali, Firenze), motivating the children to draw the man and the woman and following the descriptive guidelines. When the drawing of the man was finished, the children were given another sheet on which to draw a woman, and once that was completed they were given a questionnaire to answer with the figures of the man and the woman in front of them. The test was administered in the first hours of class, in the morning or the afternoon, to avoid times when the pupils might be tired and that tiredness might influence the results.
An atmosphere of relaxation and rapport between tester and pupils was sought during administration, and all protocols that did not meet the norms required by the test were invalidated; these totalled 10. Conclusions: 1) The positive and significant correlation obtained between the drawing of the man and that of the woman in both samples of normal subjects demonstrates the reliability of the test. 2) The correlations of the test, for both the man and the woman drawings, with the previous year's mean school grades in the areas of natural sciences and drawing are all positive and mostly significant, indicating an effective correlation between the test and school performance. 3) The various positive and significant correlations between the previous year's mean grades, and between these grades and the test, can provide information about the type of intelligence the test measures, allowing school success to be predicted in the areas examined more than in other subjects; such prediction, however, needs further study. 4) There are no significant differences in test results between one age and the immediately following one, but in general there is a significant difference across two years of age. 5) The differences between the Sevilla-Córdoba sample and the Las Palmas de Gran Canaria sample are significant, in favour of the Andalusian children. 6) In the sample of mentally deficient subjects there is a positive and significant correlation between the drawing of the man and that of the woman in all cases, except for the girls of the Las Palmas sample, where the correlation, although positive, is not significant; one can therefore speak only of a moderate correlation between the two drawings.
7) Nothing certain can be stated regarding significant differences of means, either when comparing the results obtained by mentally deficient boys and girls of the Sevilla-Córdoba sample or when comparing them with the Las Palmas sample; the small number of subjects and the differing degrees of deficiency within the various uncontrolled ages would make any assertion equivocal and unfounded. Among the techniques and tests used with children, the human figure drawing test is significant, interesting and enjoyable, since most subjects love to draw and paint; moreover, both normal and deficient subjects frequently draw human figures. The representation of the human figure in its various stages reflects the child's intellectual development, since the child draws not what he sees but what he knows. Accordingly, the results obtained with the test were satisfactory for both normal and deficient subjects.
Abstract:
This work presents the results of administering Raven's test to different EGB groups in order to gain an objective picture of Spanish pupils at this educational level. It is preceded by simplified administration guidelines, accompanied by those given by the test's author, and by a concise, practical review of the test's varied potential uses for the thorough assessment of the examinee, together with various practical observations by the author.
Abstract:
Chatterbox Challenge is an annual web-based contest for artificial conversational entities (ACE). The 2010 instantiation was the tenth consecutive contest, held between March and June in the 60th year following the publication of Alan Turing's influential disquisition 'Computing Machinery and Intelligence'. Loosely based on Turing's viva voce interrogator-hidden witness imitation game, a thought experiment to ascertain a machine's capacity to respond satisfactorily to unrestricted questions, the contest provides a platform for technology comparison and evaluation. This paper provides an insight into emotion content in the entries since the 2005 Chatterbox Challenge. The authors find that synthetic textual systems, none of which are backed by academic or industry funding, are, on the whole and more than half a century since Weizenbaum's natural language understanding experiment, little further than Eliza in terms of expressing emotion in dialogue. This may be a failure on the part of the academic AI community to take up the Turing test as an engineering challenge.
Abstract:
Purpose – The purpose of this paper is to consider Turing's two tests for machine intelligence: the parallel-paired, three-participant game presented in his 1950 paper, and the "jury-service" one-to-one measure described two years later in a radio broadcast. Both versions were instantiated in practical Turing tests during the 18th Loebner Prize for artificial intelligence, hosted at the University of Reading, UK, in October 2008, which involved jury-service tests in the preliminary phase and parallel-paired tests in the final phase. Design/methodology/approach – Almost 100 test results from the final have been evaluated, and this paper reports some intriguing nuances which arose as a result of the unique contest. Findings – In the 2008 competition, Turing's 30 per cent pass rate was not achieved by any machine in the parallel-paired tests, but Turing's modified prediction, "at least in a hundred years' time", is remembered. Originality/value – The paper presents actual responses from "modern Elizas" to human interrogators during contest dialogues that show considerable improvement in artificial conversational entities (ACE). Unlike their ancestor – Weizenbaum's natural language understanding system – ACE are now able to recall, share information and disclose personal interests.
Abstract:
A series of imitation games, involving both 3-participant (simultaneous comparison of two hidden entities) and 2-participant (direct interrogation of a hidden entity) formats, was conducted at Bletchley Park on the 100th anniversary of Alan Turing's birth: 23 June 2012. From the ongoing analysis of over 150 games involving judges (expert and non-expert, male and female, adult and child), machines and hidden humans (foils for the machines), we present six particular conversations between human judges and a hidden entity that produced unexpected results. From this sample we focus on a feature of Turing's machine intelligence test that the mathematician and codebreaker did not consider in his examination of machine thinking: the subjective nature of attributing intelligence to another mind.