825 results for The use of robots in education
Abstract:
REBIUN study on Science 2.0 and social web applications for research. Three categories are covered: sharing research, sharing resources, and sharing results. The report describes the applications and selected resources of interest: scientific social networks, scientific databases, research platforms, surveys, concept maps, file sharing, bibliographic management, social bookmarking, citation indexes, blogs and wikis, science news, and open access. The services are evaluated and their interest for libraries is discussed.
Abstract:
This paper describes preliminary results of a qualitative case study on mobile communication conducted in an elders' retirement home in Toronto (Ontario, Canada) in May 2012. This is part of an international research project on the relationship between mobile communication and older people. Secondary data at the Canadian level contextualize the case study. We focus on demographic characteristics and on the adoption and use of information and communication technologies (ICTs) broken down by age. Participants in the study (21 individuals) are between 75 and 98 years of age; we can therefore consider that the gathered evidence refers to the 'old' old. Mobile phone users in the sample describe very specific uses of the mobile phone, while non-users report not facing external pressure to adopt that technology. The main channel for mediated communication is the landline; in consequence, mobile phones, when used, constitute an extra layer of communication. Finally, when members of an individual's personal network live abroad, that individual is more prone to use the Internet and Skype. We also found ex-users of both mobile telephony and computers/Internet who stopped using these technologies because they did not find any use for them.
Abstract:
This paper introduces a qualitative case study on mobile communication among the older population (60+ years old) conducted in Greater Los Angeles (CA, USA) in autumn 2011. Methodology, fieldwork, and preliminary results are discussed. First, country-level data are presented to better understand the specific characteristics of the studied individuals; the focus is on demographics and on the acceptance and use of information and communication technologies (ICT). Preliminary results show that within the sample under study (20 individuals) there is a high number of mobile phone users (15), while among the non-mobile users (5), three decided to stop using this technology. A majority of mobile phone adopters describe a very limited use of the device for everyday communication. Finally, while the Internet is quite popular within the sample (14 users), just 3 individuals go online through their mobile telephone.
Abstract:
The use of enterprise resource planning systems has shifted the role of maintenance workers in the forest industry from basic maintenance tasks toward more self-directed processing and refinement of information. The aim of this work was to investigate the effects of a mobile work-order system on maintenance field work in the forest industry. Using a mobile work-order system in forest industry maintenance makes it possible to collect and refine important maintenance data in the field, where it is usually the most up to date and accurate. The greatest challenge is finding the right kind of technology and standardizing it for maintenance operating environments. The organization's operational processes must also be clear before those processes can be made more efficient by technological means.
Abstract:
Isotope ratio mass spectrometry (IRMS) has recently made its appearance in the forensic community. This high-precision technology has already been applied to a broad range of forensic fields such as illicit drugs, explosives and flammable liquids, where current, routinely used techniques have limited powers of discrimination. The conclusions drawn from the majority of these IRMS studies appear to be very promising. Used in a comparative process, as in food or drug authentication, the measurement of stable isotope ratios is a new and remarkable analytical tool for the discrimination or the identification of a substance with a definite source or origin. However, the research consists mostly of preliminary studies. The significance of this 'new' piece of information needs to be evaluated in light of a forensic framework to assess the actual potential and validity of IRMS, considering the characteristics of each field. Through the isotopic study of black powder, this paper aims at illustrating the potential of the method and the limitations of current knowledge in stable isotopes when facing forensic problems.
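For context (this is an addition, not part of the abstract): IRMS results are conventionally reported in delta notation, which compares the isotope ratio of the sample with that of an international standard.

```latex
% Delta notation for stable isotope ratios, reported in per mil,
% where R is the heavy-to-light isotope ratio (e.g., ^13C / ^12C).
\[
  \delta = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000
\]
```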
Abstract:
Alkyl ketene dimers (AKD) are effective and highly hydrophobic sizing agents for the internal sizing of alkaline papers, but in some cases they may form deposits on paper machines and copiers. In addition, alkenyl succinic anhydride (ASA)-based sizing agents are highly reactive, producing on-machine sizing, but under uncontrolled wet end conditions the hydrolysis of ASA may cause problems. This thesis aims at developing an improved ketene dimer based sizing agent that would have a lower deposit formation tendency on paper machines and copiers than a traditional type of AKD. The aim is also to improve the ink jet printability of an AKD-sized paper. The sizing characteristics of ketene dimers have been compared to those of ASA. A lower tendency of ketene dimer deposit formation was shown in paper machine trials and in printability tests when branched fatty acids were used in the manufacture of a ketene dimer based sizing agent. Fitting the melting and solidification temperature of a ketene dimer size to the process temperature of a paper machine or a copier contributes to machine cleanliness. A lower hydrophobicity of the paper sized with branched ketene dimer compared to the paper sized with traditional AKD was discovered. However, the ink jet print quality could be improved by the use of a branched ketene dimer. The branched ketene dimer helps in balancing the paper hydrophobicity for both black and color printing. The use of a high amount of protective colloid in the emulsification was considered to be useful for the sizing performance of the liquid type of sizing agents. Similar findings were indicated for both the branched ketene dimer and ASA.
Abstract:
Estimation of the dimensions of fluvial geobodies from core data is a notoriously difficult problem in reservoir modeling. To try to improve such estimates and, hence, reduce uncertainty in geomodels, data on dunes, unit bars, cross-bar channels, and compound bars and their associated deposits are presented herein from the sand-bed braided South Saskatchewan River, Canada. These data are used to test models that relate the scale of the formative bed forms to the dimensions of the preserved deposits and, therefore, provide an insight as to how such deposits may be preserved over geologic time. The preservation of bed-form geometry is quantified by comparing the alluvial architecture above and below the maximum erosion depth of the modern channel deposits. This comparison shows that there is no significant difference in the mean set thickness of dune cross-strata above and below the basal erosion surface of the contemporary channel, thus suggesting that dimensional relationships between dune deposits and the formative bed-form dimensions are likely to be valid for both recent and older deposits. The data show that estimates of mean bankfull flow depth derived from dune, unit bar, and cross-bar channel deposits are all very similar. Thus, the use of all these metrics together can provide a useful check that all components and scales of the alluvial architecture have been identified correctly when building reservoir models. The data also highlight several practical issues with identifying and applying data relating to cross-strata. For example, the deposits of unit bars were found to be severely truncated in length and width, with only approximately 10% of the mean bar-form length remaining, thus making identification in section difficult. For similar reasons, the deposits of compound bars were found to be especially difficult to recognize, and hence, estimates of channel depth based on this method may be problematic. Where only core data are available (i.e., no outcrop data exist), formative flow depths are suggested to be best reconstructed using cross-strata formed by dunes. However, theoretical relationships between the distribution of set thicknesses and formative dune height are found to result in slight overestimates of the latter and, hence, of the mean bankfull flow depths derived from these measurements. This article illustrates that the preservation of fluvial cross-strata and, thus, the paleohydraulic inferences that can be drawn from them, are a function of the ratio of the size and migration rate of bed forms and the time scale of aggradation and channel migration. These factors must thus be considered when deciding on appropriate length:thickness ratios for the purposes of object-based modeling in reservoir characterization.
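As an illustration of the estimation workflow the article tests (a sketch of my own, not the authors' code), the snippet below back-calculates a formative dune height and a bankfull flow depth from cross-set thicknesses measured in core. The two scaling coefficients are hypothetical placeholders and should be replaced by the empirical relationships the article actually evaluates.

```python
"""Illustrative sketch: estimating formative dune height and bankfull flow
depth from dune cross-set thicknesses measured in core.

The scaling coefficients below are placeholders, not values from the article;
published empirical relationships should be substituted before any real use.
"""

from statistics import mean


def estimate_flow_depth(set_thicknesses_m,
                        height_per_set=2.9,     # assumed dune-height / mean-set-thickness ratio
                        depth_per_height=6.0):  # assumed bankfull-depth / dune-height ratio
    """Return (mean set thickness, estimated dune height, estimated depth), in metres."""
    s_mean = mean(set_thicknesses_m)
    dune_height = height_per_set * s_mean
    flow_depth = depth_per_height * dune_height
    return s_mean, dune_height, flow_depth


if __name__ == "__main__":
    # Hypothetical core measurements of dune cross-set thickness (m).
    sets = [0.18, 0.22, 0.15, 0.30, 0.25, 0.20]
    s_mean, h, d = estimate_flow_depth(sets)
    print(f"mean set thickness {s_mean:.2f} m -> dune height ~{h:.2f} m -> bankfull depth ~{d:.1f} m")
```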
Abstract:
Current diagnostic methods for differentiating septic from non-septic arthritis are time-consuming (culture) or have limited sensitivity (Gram stain). Microcalorimetry is a novel method that can rapidly detect microorganisms by their heat production. We investigated the accuracy and time to detection of septic arthritis using microcalorimetry. Patients older than 18 years of age with acute arthritis of native joints were prospectively included. Synovial fluid was aspirated and investigated by Gram stain, culture and microcalorimetry. The diagnoses of septic arthritis and non-septic arthritis were made by experienced rheumatologists or orthopaedic surgeons. Septic arthritis was diagnosed by considering the finding of acute arthritis together with findings such as a positive Gram stain, a positive culture of synovial fluid, or a positive blood culture. The sensitivity and specificity for diagnosing septic arthritis and the time to positivity of microcalorimetry were determined. Of 90 patients (mean age 64 years), nine had septic arthritis, of whom four (44 %) had a positive Gram stain, six (67 %) a positive synovial fluid culture and four (44 %) a positive blood culture. The sensitivity of microcalorimetry was 89 %, the specificity was 99 % and the mean detection time was 5.0 h (range, 2.2-8.0 h). Microcalorimetry is an accurate and rapid method for the diagnosis of septic arthritis and has the potential to be used in clinical practice.
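To make the reported metrics explicit (my addition, not part of the study), the sketch below computes sensitivity and specificity from a 2x2 confusion matrix. The cell counts are illustrative values consistent with 9 septic and 81 non-septic patients; the exact counts are not reported in the abstract.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Standard definitions: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity


if __name__ == "__main__":
    # Illustrative counts consistent with 9 septic and 81 non-septic cases;
    # the individual cell counts are an assumption, not reported figures.
    sens, spec = sensitivity_specificity(tp=8, fn=1, tn=80, fp=1)
    print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```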
Abstract:
PURPOSE OF REVIEW: Multimodal monitoring (MMM) is routinely applied in neurointensive care. Unfortunately, there is no robust evidence on which MMM-derived physiologic variables are the most clinically relevant, how and when they should be monitored, and whether MMM impacts outcome. The complexity is even higher because, even once the data are continuously collected, the interpretation and integration of these complex physiologic events into targeted individualized care are still embryonic. RECENT FINDINGS: Recent clinical investigation has mainly focused on intracranial pressure, brain perfusion, and oxygen availability, along with electrophysiology. Moreover, a series of articles reviewing the available evidence on all the MMM tools and giving practical recommendations for bedside MMM has been published, along with other consensus documents on the role of neuromonitoring and electroencephalography in this setting. SUMMARY: MMM allows comprehensive exploration of the complex pathophysiology of acute brain damage and, depending on the configuration of the pathological condition being treated, the application of targeted individualized care. Unfortunately, we still lack robust evidence on how to better integrate MMM-derived information at the bedside to improve patient management. Advanced informatics is promising and may provide us with a supportive tool to interpret physiologic events and guide pathophysiology-based therapeutic decisions.
Abstract:
This paper reviews experimental methods for the study of the responses of people to violence in digital media, and in particular considers the issues of internal validity and ecological validity, or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand, studies based on field data, while having ecological validity, cannot control multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call "plausibility", including the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study being developed that looks at bystander responses to violent incidents.
Abstract:
Biologic agents (also termed biologicals or biologics) are therapeutics that are synthesized by living organisms and directed against a specific determinant, for example, a cytokine or receptor. In inflammatory and autoimmune diseases, biologicals have revolutionized the treatment of several immune-mediated disorders. Biologicals have also been tested in allergic disorders. These include agents targeting IgE; T helper 2 (Th2)-type and Th2-promoting cytokines, including interleukin-4 (IL-4), IL-5, IL-9, IL-13, IL-31, and thymic stromal lymphopoietin (TSLP); pro-inflammatory cytokines, such as IL-1β, IL-12, IL-17A, IL-17F, IL-23, and tumor necrosis factor (TNF); chemokine receptor CCR4; and lymphocyte surface and adhesion molecules, including CD2, CD11a, CD20, CD25, CD52, and OX40 ligand. In this task force paper of the Interest Group on Biologicals of the European Academy of Allergy and Clinical Immunology, we review biologicals that are currently available or tested for use in various allergic and urticarial pathologies, providing an overview of their state of development, area of use, adverse events, and future research directions.
Abstract:
Educational institutions are considered a keystone for the establishment of a meritocratic society. They supposedly serve two functions: an educational function that promotes learning for all, and a selection function that sorts individuals into different programs, and ultimately social positions, based on individual merit. We study how the function of selection relates to support for assessment practices known to harm vs. benefit lower status students, through the perceived justice principles underlying these practices. We study two assessment practices: normative assessment (focused on ranking and social comparison, known to hinder the success of lower status students) and formative assessment (focused on learning and improvement, known to benefit lower status students). Normative assessment is usually perceived as relying on an equity principle, with rewards being allocated based on merit, and should thus appear positively associated with the function of selection. Formative assessment is usually perceived as relying on corrective justice, which aims to ensure equality of outcomes by considering students' needs, and this makes it less suitable for the function of selection. A questionnaire measuring these constructs was administered to university students. Results showed that believing that education is intended to select the best students positively predicts support for normative assessment, through increased perception of its reliance on equity, and negatively predicts support for formative assessment, through reduced perception of its ability to establish corrective justice. This study suggests that the belief in the function of selection as inherent to educational institutions can contribute to the reproduction of social inequalities: it prevents a shift away from assessment practices known to disadvantage lower status students (normative assessment) toward more favorable practices (formative assessment), and it promotes matching beliefs in justice principles.
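Because the abstract describes indirect ("through") effects, the implied statistical model is a simple mediation. The sketch below is a generic illustration of that approach and is not the authors' analysis; the variable names and the synthetic demo data are hypothetical.

```python
"""Generic simple-mediation sketch (predictor -> mediator -> outcome).

Not the authors' analysis; column names are hypothetical labels for
questionnaire scores: 'selection_belief' (X), 'perceived_equity' (M),
'support_normative' (Y).
"""

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf


def simple_mediation(df: pd.DataFrame) -> dict:
    # Path a: predictor -> mediator.
    a = smf.ols("perceived_equity ~ selection_belief", data=df).fit().params["selection_belief"]
    # Paths b and c': mediator and predictor -> outcome.
    outcome = smf.ols("support_normative ~ selection_belief + perceived_equity", data=df).fit()
    b = outcome.params["perceived_equity"]
    c_prime = outcome.params["selection_belief"]
    return {"indirect (a*b)": a * b, "direct (c')": c_prime}


if __name__ == "__main__":
    # Synthetic data, for illustration only.
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    m = 0.5 * x + rng.normal(size=200)
    y = 0.4 * m + 0.2 * x + rng.normal(size=200)
    demo = pd.DataFrame({"selection_belief": x, "perceived_equity": m, "support_normative": y})
    print(simple_mediation(demo))
```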
Abstract:
Many therapies have been proposed for the management of temporomandibular disorders, including the use of different drugs. However, lack of knowledge about the mechanisms behind the pain associated with this pathology, and the fact that the studies carried out so far use highly disparate patient selection criteria, mean that results on the effectiveness of the different medications are inconclusive. This study makes a systematic review of the literature published on the use of tricyclic antidepressants for the treatment of temporomandibular disorders, using the SORT (Strength of Recommendation Taxonomy) criteria to assess the level of scientific evidence of the different studies. Following analysis of the articles, and based on their scientific quality, a type B recommendation is given in favor of the use of tricyclic antidepressants for the treatment of temporomandibular disorders.
Abstract:
Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of uses is quite broad, from understanding the requirements of a single species, to the creation of nature reserves based on species hotspots, to modeling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. However, in some cases these models are used with resolutions below the kilometer scale and are thus called high resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very high resolution modeling. However, these new variables are very costly and require a substantial amount of time to process. This is especially the case when these variables are used in complex calculations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models, using very high resolution topographic, climatic or edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses of these variables for a subset of species living in this area at two specific elevation belts. During this thesis I showed that high resolution data necessitate very good datasets (species and variables for the models) to produce satisfactory results. Indeed, in mountain areas, temperature is the most important factor driving species distributions, and it needs to be modeled at very fine resolution, instead of being interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, results are mixed. Looking at the importance of variables over a large gradient, however, buffers the importance of individual variables. Indeed, topographic factors have been shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land use factors are more important, high resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models happens when edaphic variables are added. Indeed, adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modeling but necessitates very good datasets. Increasing only the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors has been highlighted as fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
-- In recent years, the use of species distribution models (SDMs) has continually increased. These models use various statistical tools to reconstruct the realized niche of a species from presence data collected in the field and a set of variables, notably climatic or topographic ones. Their use covers many domains, from the study of a species' ecology to the reconstruction of communities or the impact of global warming. Most of the time, these models use occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, that is, below the kilometre scale, with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very high resolution data has appeared that allows work at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical computation, such as projections of species distributions over large areas, demands powerful computers and a great deal of time. Moreover, the factors governing species distributions at fine scales are still poorly known, and the importance of high resolution variables such as microtopography or temperature in the models is not certain. Other factors such as competition or natural stochasticity could have an equally strong influence. My thesis work is set in this context. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Préalpes vaudoises. I also sought to understand the local impact of certain variables that are potentially neglected because of confounding effects along the elevation gradient. During this thesis, I was able to show that high resolution variables, whether related to temperature or microtopography, bring only a limited improvement to the models. To obtain a substantial improvement, it is necessary to work with larger datasets, both for the species and for the variables used. For example, the usually interpolated climatic layers must be replaced by temperature layers modelled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a resolution of 25 m. However, at a more local scale, high resolution is an extremely important factor in the subalpine belt. At the montane belt, in contrast, variables related to soils and land use are very important. Finally, species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance supplants or equals that of the topographic variables when they are added to the usual species distribution models.
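To make the modeling framework concrete, here is a minimal SDM sketch (my addition, not the thesis code): a logistic regression on presence-absence data, fit once with topoclimatic predictors only and once with an added edaphic predictor (pH), so the contribution of the soil variable can be compared via AUC. The file name and column names are hypothetical.

```python
"""Minimal species distribution model sketch (not the thesis code).

Fits presence-absence data with topoclimatic predictors only, then with an
added edaphic predictor (pH), and compares discrimination via AUC.
Column names ('presence', 'temperature', 'slope', 'radiation', 'ph') and the
input file are hypothetical.
"""

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def fit_and_score(df: pd.DataFrame, predictors: list[str]) -> float:
    """Fit a logistic-regression SDM on a train split and return the test AUC."""
    X_train, X_test, y_train, y_test = train_test_split(
        df[predictors], df["presence"], test_size=0.3, random_state=42, stratify=df["presence"]
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])


if __name__ == "__main__":
    plots = pd.read_csv("vegetation_plots.csv")  # hypothetical plot-level dataset
    auc_topoclimatic = fit_and_score(plots, ["temperature", "slope", "radiation"])
    auc_with_soil = fit_and_score(plots, ["temperature", "slope", "radiation", "ph"])
    print(f"AUC topoclimatic only: {auc_topoclimatic:.3f}")
    print(f"AUC with edaphic pH:   {auc_with_soil:.3f}")
```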
Abstract:
A new family of distortion risk measures, GlueVaR, is proposed in Belles-Sampera et al. (2013) to procure a risk assessment lying between those provided by common quantile-based risk measures. GlueVaR risk measures may be expressed as a combination of these standard risk measures. We show here that this relationship may be used to obtain approximations of GlueVaR measures for general skewed distribution functions using the Cornish-Fisher expansion. A subfamily of GlueVaR measures satisfies the tail-subadditivity property. An example of risk measurement based on real insurance claim data is presented, where implications of tail-subadditivity in the aggregation of risks are illustrated.
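The sketch below illustrates (as my own addition, not the authors' code) the two ingredients named in the abstract: GlueVaR written as a weighted combination of TVaR and VaR, and a Cornish-Fisher adjusted quantile that can stand in for the empirical quantile when only the first four moments of a skewed distribution are known. The combination weights are left as user inputs here; in the paper they follow from the GlueVaR distortion parameters.

```python
"""Illustrative sketch of GlueVaR as a combination of standard quantile-based
risk measures, plus a Cornish-Fisher adjusted quantile for skewed losses.
The weights (w1, w2, w3) are taken as given inputs; deriving them from the
distortion parameters is left to the referenced paper."""

import numpy as np
from scipy.stats import norm


def var(losses: np.ndarray, level: float) -> float:
    """Empirical Value-at-Risk at the given confidence level."""
    return float(np.quantile(losses, level))


def tvar(losses: np.ndarray, level: float) -> float:
    """Empirical Tail Value-at-Risk: mean loss at or beyond VaR."""
    q = var(losses, level)
    return float(losses[losses >= q].mean())


def glue_var(losses: np.ndarray, alpha: float, beta: float,
             w1: float, w2: float, w3: float) -> float:
    """GlueVaR as a weighted combination of TVaR_beta, TVaR_alpha and VaR_alpha."""
    return w1 * tvar(losses, beta) + w2 * tvar(losses, alpha) + w3 * var(losses, alpha)


def cornish_fisher_quantile(mu: float, sigma: float, skew: float,
                            excess_kurt: float, level: float) -> float:
    """Quantile approximation from the first four moments (Cornish-Fisher expansion)."""
    z = norm.ppf(level)
    z_cf = (z
            + (z ** 2 - 1) * skew / 6
            + (z ** 3 - 3 * z) * excess_kurt / 24
            - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
    return mu + sigma * z_cf
```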