822 results for Toman, Justin
Abstract:
Because several studies indicate that students can benefit from deeper understandings of the processes by which historical accounts are constructed, history educators have increasingly focused on finding ways to teach students to read and reason about events in the same manner as professional historians (Wineburg, 2001; Spoehr & Spoehr, 1994; Hynd, Holschuh, & Hubbard, 2004; Wiley & Voss, 1996). One possible resource for supporting this development may come from emerging web-based technologies. New technologies and increased access to historical records and artifacts posted on the Internet may be precisely the tools that can help students (Bass, Rosenzweig, & Mason, 1999). Given the right context, we believe it is possible to combine such resources and tools to create an environment that could strengthen students' abilities to read and reason about historical events. Moreover, we believe that social media, specifically microblogging (Nardi, Schiano, Gumbrecht, & Swartz, 2004), could play a key role.
Abstract:
The purpose of this paper is to present an approach for students to have non-traditional learning assessed for credit and to introduce a tool that facilitates this process. The OCW Backpack system can connect self-learners with KNEXT assessment services to obtain college credit for prior learning. An ex post facto study based on historical data collected over the past two years at Kaplan University (KU) is presented to validate the portfolio assessment process. Cumulative GPA of students who received experiential credit for learning derived from personal or professional experience was compared with that of a matched sample of students with no experiential learning credits. The study found that students who received experiential credits performed better on GPA than the matched-sample students. The findings validate the KU portfolio assessment process. Additionally, the results support the capability of the OCW Backpack to capture the critical information necessary to evaluate non-traditional learning for university credit.
Abstract:
This project aims to build an application for collecting the data that chronic patients must record periodically as a form of monitoring. The general objective of the project at this stage is to enable a patient with a given chronic disease to record, autonomously and with a smart mobile device, the monitoring data that are usually collected.
Abstract:
The introduction of engineered nanostructured materials into a rapidly increasing number of industrial and consumer products will result in enhanced exposure to engineered nanoparticles. Workplace exposure has been identified as the most likely source of uncontrolled inhalation of engineered aerosolized nanoparticles, but release of engineered nanoparticles may occur at any stage of the lifecycle of (consumer) products. The dynamic development of nanomaterials with possibly unknown toxicological effects poses a challenge for the assessment of nanoparticle-induced toxicity and safety.

In this consensus document from a workshop on in-vitro cell systems for nanoparticle toxicity testing (Workshop on 'In-Vitro Exposure Studies for Toxicity Testing of Engineered Nanoparticles', sponsored by the Association for Aerosol Research (GAeF), 5-6 September 2009, Karlsruhe, Germany), an overview is given of the main issues concerning exposure to airborne nanoparticles, lung physiology, biological mechanisms of (adverse) action, in-vitro cell exposure systems, realistic tissue doses, risk assessment and social aspects of nanotechnology. The workshop participants recognized the large potential of in-vitro cell exposure systems for reliable, high-throughput screening of nanoparticle toxicity. For the investigation of lung toxicity, a strong preference was expressed for air-liquid interface (ALI) cell exposure systems (rather than submerged cell exposure systems) as they more closely resemble in-vivo conditions in the lungs and they allow for unaltered and dosimetrically accurate delivery of aerosolized nanoparticles to the cells. An important aspect, which is frequently overlooked, is the comparison of typically used in-vitro dose levels with realistic in-vivo nanoparticle doses in the lung. If we consider average ambient urban exposure and occupational exposure at 5 mg/m³ (the maximum level allowed by the Occupational Safety and Health Administration (OSHA)) as the boundaries of human exposure, the corresponding upper-limit range of nanoparticle flux delivered to the lung tissue is 3×10⁻⁵ to 5×10⁻³ μg/h/cm² of lung tissue and 2-300 particles/h per (epithelial) cell. This range can be easily matched and even exceeded by almost all currently available cell exposure systems.

The consensus statement includes a set of recommendations for conducting in-vitro cell exposure studies with pulmonary cell systems and identifies urgent needs for future development. As these issues are crucial for the introduction of safe nanomaterials into the marketplace and the living environment, they deserve more attention and more interaction between biologists and aerosol scientists. The members of the workshop believe that further advances in in-vitro cell exposure studies would be greatly facilitated by a more active role of the aerosol scientists. The technical know-how for developing and running ALI in-vitro exposure systems is available in the aerosol community, and at the same time biologists/toxicologists are required for proper assessment of the biological impact of nanoparticles.
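For orientation, the quoted upper bound of the tissue flux is consistent with a simple mass-balance estimate. The sketch below is not part of the consensus document; the ventilation rate, deposition fraction, and alveolar surface area are assumed typical values chosen for illustration.

```python
# Back-of-the-envelope estimate of nanoparticle mass flux to lung tissue at the
# occupational upper exposure bound discussed above. All physiological
# parameters are assumed typical values, not figures from the workshop document.
exposure_mg_per_m3 = 5.0        # mass concentration (OSHA-style upper bound)
ventilation_m3_per_h = 1.0      # minute ventilation, light activity (assumption)
deposition_fraction = 0.3       # alveolar deposition fraction (assumption)
alveolar_area_cm2 = 100 * 1e4   # ~100 m^2 of alveolar surface (assumption)

deposited_ug_per_h = exposure_mg_per_m3 * 1000 * ventilation_m3_per_h * deposition_fraction
flux_ug_per_h_per_cm2 = deposited_ug_per_h / alveolar_area_cm2
print(f"{flux_ug_per_h_per_cm2:.1e} ug/h/cm^2")   # ~1.5e-03, inside the 3e-05 to 5e-03 range quoted
```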
Abstract:
How could the international community, in a globalized world, commit itself to conflict resolution? In the twenty-first century, this necessarily involves questioning the methods traditionally employed for conflict resolution and security (new scenarios call for new strategies). These take shape in the doctrines of prevention, transformation, conflict resolution, crisis management, and multidimensional/collective security. Turning to Europe, the implementation of common policies in conflict zones becomes more urgent every day. There is no collective external action when a crisis breaks out, because in the end the decisions of the most powerful states always prevail. It is this very decision-making process, anchored in realist positions, that blocks or delays any attempt at a common response. Meanwhile, the violence continues and we watch helplessly as wars and escalations unfold before the trapped gaze of the West. The EU faces an ever more pressing challenge: achieving globalizing action on human rights, because economic globalization creates the need to counteract its effects by globalizing human rights as well. European responses and capabilities in the face of an emerging crisis should be reviewed.
Abstract:
In this paper we propose a new approach for tonic identification in Indian art music and present a proposal for a complete iterative system for the same. Our method splits the task of tonic pitch identification into two stages. In the first stage, which is applicable to both vocal and instrumental music, we perform a multi-pitch analysis of the audio signal to identify the tonic pitch-class. Multi-pitch analysis allows us to take advantage of the drone sound, which constantly reinforces the tonic. In the second stage, which is needed only for vocal performances, we estimate the octave in which the singer's tonic lies by analysing the predominant melody sung by the lead performer. Both stages are individually evaluated on a sizable music collection and are shown to obtain good accuracy. We also discuss the types of errors made by the method. Further, we present a proposal for a system that aims to incrementally utilize all the available data, both audio and metadata, in order to identify the tonic pitch. It produces a tonic estimate and a confidence value, and is iterative in nature: at each iteration, more data is fed into the system until the confidence value for the identified tonic exceeds a defined threshold. Rather than obtaining high overall accuracy on our complete database, our ultimate goal is to develop a system that obtains very high accuracy on a subset of the database with maximum confidence.
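To make the two-stage pipeline concrete, here is a minimal, hypothetical sketch; it is not the paper's implementation. The pitch-class histogram and the octave-selection heuristic are simplifying assumptions, and the multi-pitch estimates and predominant-melody track are assumed to come from upstream analysis.

```python
import numpy as np

def tonic_pitch_class(multipitch_hz, bins_per_octave=120):
    """Stage 1: fold all multi-pitch estimates into one octave and pick the
    most salient pitch class; the drone keeps reinforcing the tonic bin."""
    cents = 1200.0 * np.log2(np.asarray(multipitch_hz, dtype=float) / 55.0)
    folded = np.mod(cents, 1200.0)
    hist, edges = np.histogram(folded, bins=bins_per_octave, range=(0.0, 1200.0))
    peak = int(np.argmax(hist))
    return 0.5 * (edges[peak] + edges[peak + 1])   # tonic pitch class, in cents above 55 Hz

def tonic_octave(melody_hz, tonic_pc_cents):
    """Stage 2 (vocal pieces only): among tonic candidates one octave apart,
    pick the one closest to the lower end of the sung melody's range."""
    melody_hz = np.asarray(melody_hz, dtype=float)
    low_end = np.percentile(melody_hz[melody_hz > 0], 10)   # proxy for the singer's low register
    candidates = 55.0 * 2.0 ** ((tonic_pc_cents + 1200.0 * np.arange(1, 6)) / 1200.0)
    return candidates[np.argmin(np.abs(np.log2(candidates / low_end)))]

# Toy usage: a drone around 138.6 Hz plus some assorted melody pitches.
multipitch = np.concatenate([np.full(500, 138.6), np.random.uniform(150, 400, 200)])
melody = np.random.uniform(140, 450, 300)
pc = tonic_pitch_class(multipitch)
print(round(tonic_octave(melody, pc), 1))   # should land near 138.6 Hz
```

The iterative system described above would wrap such estimators in a loop that adds further audio and metadata evidence until the confidence value for the identified tonic exceeds the chosen threshold.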