940 results for NLP (Natural Language Processing)


Relevance:

30.00%

Publisher:

Abstract:

A didactic review underlying the teaching of foreign languages in the socio-pedagogical context of the classroom, with the aim of enabling students to acquire the necessary communicative language skills, both oral and written. Applied to six first-year BUP groups of 35 students each at the Instituto Tomás de Iriarte in Santa Cruz de Tenerife. The methodology followed a natural approach, for which a favorable attitude, an increasing order of difficulty, the creation of formal rules, messages meaningful to the learner, etc., were indispensable. Information was gathered through personal questionnaires (interests, social background, etc.) and observations of knowledge of, and attitudes toward, the second language. The experience proved valuable both for the professional development of the teachers and for the academic progress of the students. The natural approach was rated favorably, and the need for further research in second language acquisition was noted.

Relevance:

30.00%

Publisher:

Abstract:

This study set out to investigate and analyze how the hydrocarbon and natural gas market operates, in order to determine the influence of oil and natural gas exploration and storage on the relationship between organizations and communities. The concept of community is taken from relational marketing, where community refers to consumers and the environment in which they are embedded. In this context, the main actors involved in the commercial relationship were identified, along with the type of relationship between them and the factors shaping a relationship that is increasingly unstable and short-term. The research gathered information on commercial relationships in the hydrocarbon market that will serve as a basis for future studies aimed at proposing alternatives for coping with the uncertainty of this market, and thereby developing a more reliable and lasting relationship between the organizations and the communities involved in the commercial process. Although a wide range of strategies exists for maintaining a stable relationship, in most cases they are not used.

Relevance:

30.00%

Publisher:

Abstract:

How do resource booms affect human capital accumulation? We exploit time and spatial variation generated by the commodity boom across local governments in Peru to measure the effect of natural resources on human capital formation. We explore the effect of both mining production and tax revenues on test scores, finding a substantial and statistically significant effect for the latter. Transfers to local governments from mining tax revenues are linked to an increase in math test scores of around 0.23 standard deviations. We find that the hiring of permanent teachers, as well as increases in parental employment and improvements in the health outcomes of adults and children, are plausible mechanisms for such a large effect on learning. These findings suggest that redistributive policies could facilitate the accumulation of human capital in resource-abundant developing countries as a way to avoid the natural resource curse.
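The abstract above does not name the estimator, but a standard way to exploit variation across local governments and over time is a two-way fixed-effects panel regression. The sketch below is only an illustration of that setup, using synthetic data and assumed variable names; it is not the authors' specification, code, or data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic panel: one row per (district, year), with mining transfers
# varying across districts and over time -- purely illustrative numbers.
rng = np.random.default_rng(0)
rows = []
for district in range(50):
    for year in range(2007, 2013):
        transfers = rng.gamma(2.0, 1.0)  # hypothetical per-capita transfers
        score = 0.2 * transfers + 0.1 * district + 0.05 * (year - 2007) + rng.normal(0, 1)
        rows.append({"district": district, "year": year,
                     "transfers": transfers, "score": score})
df = pd.DataFrame(rows)

# District and year fixed effects absorb time-invariant district traits and
# common shocks; the coefficient on transfers is the quantity of interest.
model = smf.ols("score ~ transfers + C(district) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["district"]})
print(model.params["transfers"])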

Relevance:

30.00%

Publisher:

Abstract:

To answer how language is processed, how all the elements involved in comprehension work, and in what order linguistic processing takes place. The sample consists of ESO (compulsory secondary education) students with no hearing impairment. The experimental group comprises 31 boys and 12 girls with difficulties in the Language subject; some of them also have learning problems in Mathematics and English. Two tests are carried out. The first deals with oral comprehension. Each student receives a booklet and has 25 minutes; personal data are the last thing they must write. If they cannot hear well, they note it in the booklet, so that comprehension failures due to poor sound can be controlled for. A recording is played three times, and acoustic differences between students in the first and last rows are monitored during playback. The students answer the questions. Those who have problems with the definitions are asked to fill in the last sheet, to check whether they know the meaning rather than their ability to express it. The second booklet is handed out once the whole group has finished, with unlimited time; if they do not know a word, its meaning is explained. Finally, an immediate auditory memory test is administered, in order to control the variable 'memory' and study its influence on the test. The second test consists of generating a language model from the same text presented to the students, and also of examining what happens when incomplete sentences are introduced for the students to fill in. The only information available to the computer is the speech signal, from which it builds the language model. Materials: portable mono recorder, cassette tape, oral comprehension answer booklet, booklet on the comprehension strategies used, booklet on comprehension-procedure strategies, answer sheet for the memory test, and SPSS and Excel for data analysis. For the second test the materials are: a Panasonic portable mono recorder, cassette tape, the ViaVoice 98 recognizer, a Pentium III with sound card, and the CMU Statistical Language Modeling Toolkit (text2wfreq, text2idngram, idngram2lm, evallm). For the first test a multivariate experimental design was drawn up; the variables were memory, listening comprehension, and the strategies used to comprehend. The confounding variables (experimenter, materials, acoustic conditions, school, socioeconomic level, and age) were controlled by matching; organismic variables and sex were controlled by randomization. Auditory memory had to be controlled through an analysis of covariance. In the second test the variable was oral linguistic comprehension, so that a comparison could be established afterwards. The results of the first test show that the correlations obtained between the variables analyzed are independent and reveal differences between the experimental and control groups. Higher scores are found in subjects without difficulties in memory and comprehension. There are no differences between the two groups in comprehension strategies. The results obtained in the evaluation phase of the second test indicate that no answer was chosen correctly, so no comparison could be made.
It appears that the sample uses the same model to comprehend: all subjects use the same strategies, and the differences are quantitative and due to organismic variables, among them memory. Lack of vocabulary is the first difficulty in the group with difficulties; lack of memory prevents them from correcting mispronounced words, retrieving prior knowledge, and relating ideas in long-term memory. They are also unable to find the main idea. Comprehension is so slow that they cannot keep up with processing. It is shown that computer programs imitate humans only at elementary levels; in speech technology, semantic models are used as a priority.
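For readers unfamiliar with the CMU Statistical Language Modeling Toolkit pipeline mentioned above (text2wfreq, text2idngram, idngram2lm, evallm), the following is a minimal sketch of the same idea in Python: counting n-grams from a text and scoring word continuations. It is not the toolkit itself, and the function names are invented for this illustration.

from collections import Counter

def train_bigram_model(text):
    # Whitespace-tokenize and count unigrams and bigrams.
    tokens = ["<s>"] + text.lower().split() + ["</s>"]
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, w1, w2):
    # Add-one (Laplace) smoothed estimate of P(w2 | w1).
    vocab_size = len(unigrams)
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

# Toy usage on a placeholder sentence (the study's own text is not reproduced here).
unigrams, bigrams = train_bigram_model("el lenguaje se procesa por etapas")
print(bigram_prob(unigrams, bigrams, "el", "lenguaje"))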

Relevance:

30.00%

Publisher:

Abstract:

Abstract taken from the publication

Relevance:

30.00%

Publisher:

Abstract:

Abstract based on that of the publication

Relevance:

30.00%

Publisher:

Abstract:

Diffusion Tensor Imaging (DTI) is a new magnetic resonance imaging modality capable of producing quantitative maps of microscopic natural displacements of water molecules that occur in brain tissues as part of the physical diffusion process. This technique has become a powerful tool in the investigation of brain structure and function because it allows for in vivo measurements of white matter fiber orientation. The application of DTI in clinical practice requires specialized processing and visualization techniques to extract and represent acquired information in a comprehensible manner. Tracking techniques are used to infer patterns of continuity in the brain by following in a step-wise mode the path of a set of particles dropped into a vector field. In this way, white matter fiber maps can be obtained.
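The step-wise tracking described above can be sketched as a simple streamline integration through the field of principal diffusion directions. The sketch below assumes such a direction field is already available as a NumPy array; the function and array names are illustrative, not from the paper.

import numpy as np

def track_streamline(direction_field, seed, step_size=0.5, max_steps=200):
    # Follow the local principal diffusion direction from a seed point using
    # Euler steps, stopping at the volume boundary or in near-isotropic voxels.
    point = np.asarray(seed, dtype=float)
    path = [point.copy()]
    shape = np.array(direction_field.shape[:3])
    for _ in range(max_steps):
        voxel = np.round(point).astype(int)
        if np.any(voxel < 0) or np.any(voxel >= shape):
            break
        direction = direction_field[tuple(voxel)]
        norm = np.linalg.norm(direction)
        if norm < 1e-6:
            break
        point = point + step_size * direction / norm
        path.append(point.copy())
    return np.array(path)

# Toy usage: a synthetic field pointing along x everywhere.
field = np.zeros((10, 10, 10, 3))
field[..., 0] = 1.0
print(track_streamline(field, seed=(1.0, 5.0, 5.0)).shape)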

Relevance:

30.00%

Publisher:

Abstract:

In this study we evaluate the processing costs of different types of anaphoric expressions during reading. We consider three types of anaphoric expressions in sentential Subject position: a null pronoun (pro) and two gaps produced by syntactic movement, a WH-variable and an NP copy. Given that coreferential pro exhibits more referential weight than wh- and NP-gaps, and grounded on theories of referential processing based on relations of hierarchy and accessibility of the antecedent, we raise the hypothesis that the more dependent on its antecedent the anaphoric null constituent is, and the smaller the distance in terms of hierarchical structure between the anaphoric null element and its antecedent, the lower the cognitive costs of processing. To test our hypothesis, we recorded the eye movements of 20 Portuguese adult native speakers with the R6-HS ASL system. Text regions including the selected anaphoric expressions were delimited and tagged. We analyzed the reading time of each region taking into account the number and duration of eye fixations per region; we used the reading time per character, in milliseconds, in order to compare values between regions of different length. We found a significant advantage in the reading time of the gaps arising from movement over the reading time of pro.
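As a concrete illustration of the normalization described above (total fixation time in a region divided by its length in characters, so that regions of different length can be compared), here is a minimal sketch; the values and names are invented, not taken from the study.

def reading_time_per_char(fixation_durations_ms, region_text):
    # Sum fixation durations within the region and normalize by its length.
    return sum(fixation_durations_ms) / len(region_text)

# Toy usage: three fixations, in milliseconds, on a 24-character region.
print(reading_time_per_char([210, 180, 95], "a WH-variable gap region"))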

Relevance:

30.00%

Publisher:

Abstract:

The paper reports an interactive tool for calibrating a camera, suitable for use in outdoor scenes. The motivation for the tool was the need to obtain an approximate calibration for images taken with no explicit calibration data. Such images are frequently presented to research laboratories, especially in surveillance applications, with a request to demonstrate algorithms. The method decomposes the calibration parameters into intuitively simple components, and relies on the operator interactively adjusting the parameter settings to achieve a visually acceptable agreement between a rectilinear calibration model and his own perception of the scene. Using the tool, we have been able to calibrate images of unknown scenes, taken with unknown cameras, in a matter of minutes. The standard of calibration has proved to be sufficient for model-based pose recovery and tracking of vehicles.
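The paper's own parameter decomposition is not given in the abstract, but the idea of interactively adjusting a few intuitive parameters until a rectilinear model visually matches the scene can be sketched roughly as follows. The parameter set (focal length in pixels, pan/tilt/roll, camera height) and the function names are assumptions made for this illustration, not the tool's actual interface.

import numpy as np

def rotation_matrix(pan, tilt, roll):
    # Compose rotations (radians): pan about the vertical axis, tilt about the
    # lateral axis, roll about the viewing axis.
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])
    Rx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    Rr = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Rr @ Rx @ Rz

def project_points(world_pts, focal_px, pan, tilt, roll, cam_height, image_center):
    # Simple pinhole projection of 3-D points (metres, Z up) to pixels; an
    # operator would tweak the parameters until a projected ground grid lines
    # up with the scene.
    R = rotation_matrix(pan, tilt, roll)
    cam = (world_pts - np.array([0.0, 0.0, cam_height])) @ R.T
    x = focal_px * cam[:, 0] / cam[:, 2] + image_center[0]
    y = focal_px * cam[:, 1] / cam[:, 2] + image_center[1]
    return np.stack([x, y], axis=1)

# Toy usage: a 1 m ground-plane grid seen from a camera 5 m above the ground.
grid = np.array([[gx, gy, 0.0] for gx in range(-3, 4) for gy in range(5, 12)])
print(project_points(grid, focal_px=800, pan=0.0, tilt=np.radians(60),
                     roll=0.0, cam_height=5.0, image_center=(320, 240))[:3])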

Relevance:

30.00%

Publisher:

Abstract:

Two experiments examined the learning of a set of Greek pronunciation rules through explicit and implicit modes of rule presentation. Experiment 1 compared the effectiveness of implicit and explicit modes of presentation in two modalities, visual and auditory. Subjects in the explicit or rule group were presented with the rule set, and those in the implicit or natural group were shown a set of Greek words, composed of letters from the rule set, linked to their pronunciations. Subjects learned the Greek words to criterion and were then given a series of tests which aimed to tap different types of knowledge. The results showed an advantage of explicit study of the rules. In addition, an interaction was found between mode of presentation and modality. Explicit instruction was more effective in the visual than in the auditory modality, whereas there was no modality effect for implicit instruction. Experiment 2 examined a possible reason for the advantage of the rule groups by comparing different combinations of explicit and implicit presentation in the study and learning phases. The results suggested that explicit presentation of the rules is only beneficial when it is followed by practice at applying them.

Relevance:

30.00%

Publisher:

Abstract:

Consumers increasingly demand convenience foods of the highest quality in terms of natural flavor and taste, and which are free from additives and preservatives. This demand has triggered the need for the development of a number of nonthermal approaches to food processing, of which high-pressure technology has proven to be very valuable. A number of recent publications have demonstrated novel and diverse uses of this technology. Its novel features, which include destruction of microorganisms at room temperature or lower, have made the technology commercially attractive. Spore-forming bacteria, as well as enzymes, can be inactivated by the application of pressure-thermal combinations. This review aims to identify the opportunities and challenges associated with this technology. In addition to discussing the effects of high pressure on food components, this review covers the combined effects of high pressure processing with gamma irradiation, alternating current, ultrasound, and carbon dioxide or anti-microbial treatment. Further, the applications of this technology in various sectors (fruits and vegetables, dairy, and meat processing) have been dealt with extensively. The integration of high pressure with other mature processing operations such as blanching, dehydration, osmotic dehydration, rehydration, frying, freezing/thawing and solid-liquid extraction has been shown to open up new processing options. The key challenges identified include heat transfer problems and the resulting non-uniformity in processing, obtaining reliable and reproducible data for process validation, lack of detailed knowledge about the interaction between high pressure and a number of food constituents, and packaging and statutory issues.

Relevance:

30.00%

Publisher:

Abstract:

When wheat was grown under conditions of severe sulfate depletion, dramatic increases of up to 30-fold in the concentration of free asparagine were found in the grain, as compared to samples receiving normal levels of sulfate fertilizer. The effect was observed both in plants grown in pots, where the levels of nutrients were carefully controlled, and in plants grown in field trials on soil with poor levels of natural nutrients, where sulfate fertilizer was applied at levels from 0 to 40 kg sulfur/ha. Many of the other free amino acids were present at higher levels in the sulfate-deprived wheat, but only the levels of free glutamine showed increases similar to those observed for asparagine. In baked cereal products, asparagine is the precursor of the suspect carcinogen acrylamide, and when flours from the sulfate-deprived wheat were heated at 160 degrees C for 20 min, levels of acrylamide between 2600 and 5200 µg/kg were found, as compared to 600-900 µg/kg in wheat grown with normal levels of sulfate fertilization.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: There is an increasing interest in obtaining natural products with bioactive properties using fermentation technology. However, the downstream processing, consisting of multiple steps, can be complicated, leading to an increase in the final cost of the product. Therefore there is a need for integrated, cost-effective and scalable separation processes. RESULTS: The present study investigates the use of colloidal gas aphrons (CGA), which are surfactant-stabilized microbubbles, as a novel method for downstream processing. More particularly, their application for the recovery of astaxanthin from the cells of Phaffia rhodozyma is explored. Research carried out with standard solutions of astaxanthin and CGA generated from the cationic surfactant hexadecyltrimethylammonium bromide (CTAB) showed that up to 90% recovery can be achieved under optimum conditions, i.e., pH 11 with NaOH 0.2 mol L-1. In the case of the cell suspension from the fermentation broth, three different approaches were investigated: (a) the conventional integrated approach, where CGA were applied directly; (b) CGA applied to the clarified suspension of cells; and finally (c) the in situ approach, where CGA are generated within the clarified suspension of cells. Interestingly, in the case of the whole suspension (approach a) the highest recoveries (78%) were achieved under the same conditions found to be optimal for the standard solutions. In addition, up to 97% recovery of total carotenoids could be achieved from the clarified suspension after pretreatment with NaOH. This pretreatment led to maximum cell disruption as well as optimum conditioning for subsequent CGA separation. CONCLUSIONS: These results demonstrate the potential of CGA for the recovery of bioactive components from complex feedstock.