997 results for Implicit methods
Abstract:
In the past years, an important volume of research in Natural Language Processing has concentrated on the development of automatic systems to deal with affect in text. The different approaches considered dealt mostly with explicit expressions of emotion, at word level. Nevertheless, expressions of emotion are often implicit, inferrable from situations that have an affective meaning. Dealing with this phenomenon requires automatic systems to have “knowledge” of the situation, the concepts it describes and their interaction, to be able to “judge” it in the same manner as a person would. This necessity motivated us to develop the EmotiNet knowledge base — a resource for the detection of emotion from text based on commonsense knowledge about concepts, their interaction and their affective consequence. In this article, we briefly present the process of building EmotiNet and subsequently propose methods to extend the knowledge it contains. We further analyse the performance of implicit affect detection using this resource, comparing the results obtained with EmotiNet to those of alternative methods for affect detection. Following the evaluations, we conclude that the structure and content of EmotiNet are appropriate for addressing the automatic treatment of implicitly expressed affect, that the knowledge it contains can be easily extended and that, overall, methods employing EmotiNet obtain better results than traditional emotion detection approaches.
Abstract:
While it is generally agreed that perception can occur without awareness, there continues to be debate about the type of representational content that is accessible when awareness is minimized or eliminated. Most investigations that have addressed this issue evaluate access to well-learned representations. Far fewer studies have evaluated whether associations encountered just once prior to testing might also be accessed and influence behavior. Here, eye movements were used to examine whether memory for studied relationships is evident following the presentation of subliminal cues. Participants (assigned to experimental or control groups) studied scene-face pairs, and test trials evaluated implicit and explicit memory for these pairs. Each test trial began with a subliminal scene cue, followed by three visible studied faces. For experimental group participants, one face was the studied associate of the scene (implicit test); for controls, none was a match. Subsequently, the display containing a match was presented to both groups, but now it was preceded by a visible scene cue (explicit test). Eye movements were recorded and recognition memory responses were made. Participants in the experimental group looked disproportionately at matching faces on implicit test trials, and participants from both groups looked disproportionately at matching faces on explicit test trials, even when that face had not been successfully identified as the associate. Critically, implicit memory-based viewing effects seemed not to depend on residual awareness of the subliminal scene cues, as subjective and objective measures indicated that scenes were successfully masked from view. The reported outcomes indicate that memory for studied relationships can be expressed in eye movement behavior without awareness.
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical, self-contained model of concurrency, which in turn enables a simplified second model for implementing the compilation process. A set of principles is also presented that, if followed, maximises the potential level of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour that enables easy incorporation of message interchange, locking, and synchronisation of objects. Further, the model is complete enough that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The principles presented are based upon information hiding, sharing and containment of objects, and the division of methods on a command/query basis. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles arise naturally from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
Abstract:
The aims of this study were to investigate the beliefs concerning the philosophy of science held by practising science teachers and to relate those beliefs to their pupils' understanding of the philosophy of science. Three philosophies of science, differing in the way they relate experimental work to other parts of the scientific enterprise, are described. By the use of questionnaire techniques, teachers of four extreme types were identified. These are: the H type or hypothetico-deductivist teacher, who sees experiments as potential falsifiers of hypotheses or of logical deductions from them; the I type or inductivist teacher, who regards experiments mainly as a way of increasing the range of observations available for recording before patterns are noted and inductive generalisation is carried out; the V type or verificationist teacher, who expects experiments to provide proof and to demonstrate the truth or accuracy of scientific statements; and the O type, who has no discernible philosophical beliefs about the nature of science or its methodology. Following interviews of selected teachers to check their responses to the questionnaire and to determine their normal teaching methods, an experiment was organised in which parallel groups were given H, I and V type teaching in the normal school situation during most of one academic year. Using pre-test and post-test scores on a specially developed test of pupil understanding of the philosophy of science, it was shown that pupils were positively affected by their teacher's implied philosophy of science. There was also some indication that V type teaching improved marks obtained in school science examinations, but appeared to discourage the more able from continuing the study of science. Effects were also noted on the vocabulary used by pupils to describe scientists and their activities.
Abstract:
The merits of various numerical methods for the solution of the one- and two-dimensional heat conduction equation with a radiation boundary condition have been examined from a practical standpoint in order to determine their accuracy and efficiency. It is found that the use of five increments to approximate the space derivatives gives sufficiently accurate results provided the time step is not too large; further, the implicit backward difference method of Liebmann (27) is found to be the most accurate method. On this basis, a new implicit method is proposed for the solution of the three-dimensional heat conduction equation with radiation boundary conditions. The accuracies of the integral and analogue computer methods are also investigated.
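A minimal sketch of the fully implicit (backward difference) scheme for the one-dimensional case may clarify the approach. It assumes simple fixed-temperature (Dirichlet) boundaries rather than the radiation condition treated in the thesis, and the function name and parameter values are illustrative, not taken from the original work:

```python
import numpy as np

def implicit_heat_1d(u0, alpha, dx, dt, steps):
    """Backward-difference (fully implicit) scheme for u_t = alpha * u_xx.

    Each time step solves the tridiagonal system (I - r*D2) u^{k+1} = u^k,
    which is unconditionally stable in dt. Boundaries are held fixed
    (Dirichlet); a radiation condition would modify the end rows.
    """
    n = len(u0)
    r = alpha * dt / dx**2
    A = np.zeros((n, n))
    for i in range(1, n - 1):          # interior nodes
        A[i, i - 1] = -r
        A[i, i] = 1 + 2 * r
        A[i, i + 1] = -r
    A[0, 0] = A[-1, -1] = 1.0          # boundary nodes kept at u0 values
    u = u0.astype(float).copy()
    for _ in range(steps):
        u = np.linalg.solve(A, u)      # dense solve for clarity; a real
    return u                           # code would use a tridiagonal solver

# Five space increments (six nodes), as in the accuracy study above
u0 = np.array([0.0, 1.0, 1.0, 1.0, 1.0, 0.0])
u = implicit_heat_1d(u0, alpha=1.0, dx=0.2, dt=0.01, steps=50)
```

With cold boundaries, the interior temperatures decay symmetrically toward zero, and the scheme remains stable even for time steps far larger than the explicit stability limit.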
Abstract:
Objective: Images on food and dietary supplement packaging might lead people to infer (appropriately or inappropriately) certain health benefits of those products. Research on this issue largely involves direct questions, which could (a) elicit inferences that would not be made unprompted, and (b) fail to capture inferences made implicitly. Using a novel memory-based method, the present research explored whether packaging imagery elicits health inferences without prompting, and the extent to which these inferences are made implicitly. Method: In 3 experiments, participants saw fictional product packages accompanied by written claims. Some packages contained an image that implied a health-related function (e.g., a brain), and some contained no image. Participants studied these packages and claims, and subsequently their memory for seen and unseen claims was tested. Results: When a health image was featured on a package, participants often subsequently recognized health claims that—despite being implied by the image—were not truly presented. In Experiment 2, these recognition errors persisted despite an explicit warning against treating the images as informative. In Experiment 3, these findings were replicated in a large consumer sample from 5 European countries, and with a cued-recall test. Conclusion: These findings confirm that images can act as health claims, by leading people to infer health benefits without prompting. These inferences appear often to be implicit, and could therefore be highly pervasive. The data underscore the importance of regulating imagery on product packaging; memory-based methods represent innovative ways to measure how leading (or misleading) specific images can be.
Abstract:
The purpose of this research was to compare the delivery methods practised by higher education faculty teaching distance courses with recommended or emerging standard instructional delivery methods for distance education. Previous research shows that traditional-type instructional strategies have been used in distance education and that faculty have received no training in distance teaching. Secondary data, however, appear to suggest emerging practices which could be pooled toward the development of standards. This is a qualitative study based on the constant comparative analysis approach of grounded theory. Participants (N = 5) were full-time faculty teaching distance education courses. The observation method used was unobtrusive content analysis of videotaped instruction. Triangulation of data was accomplished through one-on-one in-depth interviews and the literature review. Because non-media content was also analysed, a special time-sampling technique was designed by the researcher, influenced by content-analyst theories of media-related data, to sample the portions of the videotaped instruction that were observed and counted. A standardized interview guide was used to collect data from the in-depth interviews. Coding was based on categories drawn from the review of literature and from Cranton and Weston's (1989) typology of instructional strategies. The data were observed, counted, tabulated, analyzed, and interpreted solely by the researcher; it should be noted, however, that systematic and rigorous data collection and analysis led to credible data.
The findings of this study supported the proposition that there are no standard instructional practices for distance teaching. Further, the findings revealed that of the emerging practices suggested by proponents and by faculty who teach distance education courses, few were practised even minimally. A noted example was the use of lecture and questioning. Questioning, as a teaching tool, was used a great deal with students at the originating site but not with distance students. Lectures were given, but were mostly conducted in traditional fashion: long in duration and with no interactive component. It can be concluded from the findings that while there are no standard practices for instructional delivery for distance education, there appears to be sufficient information from secondary and empirical data to initiate some standard instructional practices. Therefore, grounded in this research data is the theory that the way to arrive at instructional delivery standards for televised distance education is a pooling of the tacitly agreed-upon emerging practices of proponents and practising instructors. Implicit in this theory is a need for experimental research so that these emerging practices can be tested, tried, and proven, ultimately resulting in formal standards for instructional delivery in television education.
Abstract:
We survey articles covering how hedge fund returns are explained, using largely non-linear multifactor models that examine the non-linear pay-offs and exposures of hedge funds. We provide an integrated view of the implicit factor and statistical factor models that are largely able to explain the hedge fund return-generating process. We present their evolution through time by discussing pioneering studies that made a significant contribution to knowledge, and also recent innovative studies that examine hedge fund exposures using advanced econometric methods. This is the first review that analyzes very recent studies that explain a large part of hedge fund variation. We conclude by presenting some gaps for future research.
Abstract:
The aim of this clinical study was to determine the efficacy of Uncaria tomentosa (cat's claw) against denture stomatitis (DS). Fifty patients with DS were randomly assigned into 3 groups to receive 2% miconazole, placebo, or 2% U tomentosa gel. DS level was recorded immediately, after 1 week of treatment, and 1 week after treatment. The clinical effectiveness of each treatment was measured using Newton's criteria. Mycologic samples from palatal mucosa and prosthesis were obtained to determine colony-forming units per milliliter (CFU/mL) and for fungal identification at each evaluation period. Candida species were identified with HiCrome Candida and the API 20C AUX biochemical test. DS severity decreased in all groups (P < .05). A significant reduction in the number of CFU/mL after 1 week (P < .05) was observed for all groups and remained after 14 days (P > .05). C albicans was the most prevalent microorganism before treatment, followed by C tropicalis, C glabrata, and C krusei, regardless of the group and time evaluated. U tomentosa gel had the same effect as 2% miconazole gel. U tomentosa gel is an effective topical adjuvant treatment for denture stomatitis.
Abstract:
Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled with Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a resolving power of ca. 500,000 and a mass accuracy better than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, which are identified primarily as naphthenic acids, phenols and carbazole-analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
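The core of the approach above can be sketched in a few lines: fit a PLS regression to a wide data matrix, then discard variables whose coefficients are negligible. This is a minimal NIPALS PLS1 on synthetic data, and the coefficient-magnitude filter at the end is only a crude stand-in for UVE (which instead compares coefficient stability against artificially added noise variables); all names and values here are illustrative:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal NIPALS PLS1. Returns (b, x_mean, y_mean) so that
    y_hat = (X_new - x_mean) @ b + y_mean. Illustrative sketch only;
    chemometrics packages add scaling options and rank safeguards."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc                  # weight vector
        w /= np.linalg.norm(w)
        t = Xc @ w                     # score vector
        tt = t @ t
        p = Xc.T @ t / tt              # X loading
        q = (yc @ t) / tt              # y loading
        Xc = Xc - np.outer(t, p)       # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    b = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return b, x_mean, y_mean

# Synthetic "spectra": only columns 0 and 3 carry information about y.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + 0.01 * rng.normal(size=60)

b, xm, ym = pls1_fit(X, y, n_comp=3)
rmse = np.sqrt(np.mean((((X - xm) @ b + ym) - y) ** 2))

# Crude variable selection: keep the variables with the largest |b|
# and refit on the reduced matrix (a stand-in for UVE).
keep = np.argsort(np.abs(b))[-5:]
b2, xm2, ym2 = pls1_fit(X[:, keep], y, n_comp=2)
```

On real FT-ICR data the matrix has thousands of columns rather than 20, which is exactly why pruning uninformative variables before interpretation pays off.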
Abstract:
What is the contribution of the provision, at no cost for users, of long-acting reversible contraceptive methods (LARC; copper intrauterine device [IUD], the levonorgestrel-releasing intrauterine system [LNG-IUS], contraceptive implants and depot-medroxyprogesterone [DMPA] injection) towards the disability-adjusted life years (DALY) averted through a Brazilian university-based clinic established over 30 years ago? Over the last 10 years of evaluation, provision of LARC methods and DMPA by the clinic is estimated to have contributed to DALY averted of between 37 and 60 maternal deaths, 315-424 child mortalities, 634-853 combined maternal morbidity and mortality and child mortality, and 1056-1412 unsafe abortions averted. LARC methods are associated with high contraceptive effectiveness when compared with contraceptive methods which need frequent attention, perhaps because LARC methods are independent of individual or couple compliance. However, in general, previous studies have evaluated contraceptive methods during clinical studies over a short period of time, or not more than 10 years. Furthermore, information regarding the estimation of the DALY averted is scarce. We reviewed 50 004 medical charts from women who consulted for the first time looking for a contraceptive method over the period from 2 January 1980 through 31 December 2012. Women who consulted at the Department of Obstetrics and Gynaecology, University of Campinas, Brazil were new users and users switching contraceptive, including the copper IUD (n = 13 826), the LNG-IUS (n = 1525), implants (n = 277) and DMPA (n = 9387). Estimation of the DALY averted included maternal morbidity and mortality, child mortality and unsafe abortions averted. We obtained 29 416 contraceptive segments of use, including 25 009 segments from 20 821 new users or switchers to any LARC method or DMPA with at least 1 year of follow-up.
The mean (± SD) age of the women at first consultation ranged from 25.3 ± 5.7 (range 12-47) years in the 1980s, to 31.9 ± 7.4 (range 16-50) years in 2010-2011. The most common contraceptive chosen at the first consultation was copper IUD (48.3, 74.5 and 64.7% in the 1980s, 1990s and 2000s, respectively). For an evaluation over 20 years, the cumulative pregnancy rates (SEM) were 0.4 (0.2), 2.8 (2.1), 4.0 (0.4) and 1.3 (0.4) for the LNG-IUS, the implants, copper IUD and DMPA, respectively and cumulative continuation rates (SEM) were 15.1 (3.7), 3.9 (1.4), 14.1 (0.6) and 7.3 (1.7) for the LNG-IUS, implants, copper IUD and DMPA, respectively (P < 0.001). Over the last 10 years of evaluation, the estimation of the contribution of the clinic through the provision of LARC methods and DMPA to DALY averted was 37-60 maternal deaths; between 315 and 424 child mortalities; combined maternal morbidity and mortality and child mortality of between 634 and 853, and 1056-1412 unsafe abortions averted. The main limitations are the number of women who never returned to the clinic (overall 14% among the four methods under evaluation); consequently the pregnancy rate could be different. Other limitations include the analysis of two kinds of copper IUD and two kinds of contraceptive implants as the same IUD or implant, and the low number of users of implants. In addition, the DALY calculation relies on a number of estimates, which may vary in different parts of the world. LARC methods and DMPA are highly effective and women who were well-counselled used these methods for a long time. The benefit of averting maternal morbidity and mortality, child mortality, and unsafe abortions is an example to health policy makers to implement more family planning programmes and to offer contraceptive methods, mainly LARC and DMPA, at no cost or at affordable cost for the underprivileged population. 
This study received partial financial support from the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), grant # 2012/12810-4 and from the National Research Council (CNPq), grant #573747/2008-3. B.F.B., M.P.G., and V.M.C. were fellows from the scientific initiation programme from FAPESP. Since the year 2001, all the TCu380A IUDs were donated by Injeflex, São Paulo, Brazil, and from the year 2006 all the LNG-IUS were donated by the International Contraceptive Access Foundation (ICA), Turku, Finland. Both donations were unrestricted grants. The authors declare that there are no conflicts of interest associated with this study.
Abstract:
Enamel microabrasion consists of selectively abrading the discolored areas or causing superficial structural changes in a selective way. The technique uses abrasive products associated with acids, so evaluation of enamel roughness after this treatment, as well as surface polishing, is necessary. This in-vitro study evaluated enamel roughness after microabrasion, followed by different polishing techniques. Roughness analyses were performed before microabrasion (L1), after microabrasion (L2), and after polishing (L3). Thus, 60 bovine incisor teeth were selected and divided into two groups (n=30): G1: 37% phosphoric acid (Dentsply) and pumice; G2: 6.6% hydrochloric acid associated with silicon carbide (Opalustre - Ultradent). Thereafter, the groups were divided into three sub-groups (n=10), according to the polishing system: A - fine and superfine granulation aluminum oxide discs (SofLex - 3M); B - diamond paste (FGM) associated with felt discs (FGM); C - silicone tips (Enhance - Dentsply). A PROC MIXED procedure was applied after exploratory data analysis, as well as the Tukey-Kramer test (5%). No statistical differences were found between groups G1 and G2. L2 differed statistically from L1, showing greater roughness. Differences in post-polishing roughness arose for specific groups (1A, 2B, and 1C), which demonstrated less roughness at L3 and differed statistically from L2 by polishing system. All products increased enamel roughness, and the effectiveness of the polishing systems was dependent upon the abrasive used.
Abstract:
Silk fibroin has been widely explored for many biomedical applications, due to its biocompatibility and biodegradability. Sterilization is a fundamental step in biomaterials processing and it must not jeopardize the functionality of medical devices. The aim of this study was to analyze the influence of different sterilization methods on the physical, chemical, and biological characteristics of dense and porous silk fibroin membranes. Silk fibroin membranes were treated by several procedures: immersion in 70% ethanol solution, ultraviolet radiation, autoclave, ethylene oxide, and gamma radiation, and were analyzed by scanning electron microscopy, Fourier-transformed infrared spectroscopy (FTIR), X-ray diffraction, tensile strength and in vitro cytotoxicity to Chinese hamster ovary cells. The results indicated that the sterilization methods did not cause perceivable morphological changes in the membranes and the membranes were not toxic to cells. The sterilization methods that used organic solvent or increased humidity and/or temperature (70% ethanol, autoclave, and ethylene oxide) increased the silk II content in the membranes: the dense membranes became more brittle, while the porous membranes showed increased strength at break. Membranes that underwent sterilization by UV and gamma radiation presented properties similar to the nonsterilized membranes, mainly for tensile strength and FTIR results.
Abstract:
Frankfurters are widely consumed all over the world, and their production requires a wide range of meat and non-meat ingredients. Because of these characteristics, frankfurters are products that can be easily adulterated with lower-value meats and the presence of undeclared species. Adulterations are often difficult to detect, because the adulterant components are usually very similar to the authentic product. In this work, FT-Raman spectroscopy was employed as a rapid technique for assessing the quality of frankfurters. Based on information provided by the Raman spectra, a multivariate classification model was developed to identify the frankfurter type. The aim was to study three types of frankfurters (chicken, turkey and mixed meat) according to their Raman spectra, based on the fatty vibrational bands. The classification model was built using partial least squares discriminant analysis (PLS-DA), and model performance was evaluated in terms of sensitivity, specificity, accuracy, efficiency and the Matthews correlation coefficient. The PLS-DA models give sensitivity and specificity values on the test set in the range of 88%-100%, showing good performance of the classification models. The work shows that Raman spectroscopy with chemometric tools can be used as an analytical tool in the quality control of frankfurters.
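The figures of merit named above all derive from the binary confusion matrix of each class. A small sketch of how they are computed; the function name and the example counts are hypothetical, not taken from the paper:

```python
import math

def binary_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy and Matthews correlation
    coefficient (MCC) from binary confusion-matrix counts."""
    sens = tp / (tp + fn)                       # true-positive rate
    spec = tn / (tn + fp)                       # true-negative rate
    acc = (tp + tn) / (tp + fp + fn + tn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return sens, spec, acc, mcc

# e.g. one class with 40 true positives, 5 false positives,
# 10 false negatives and 45 true negatives:
sens, spec, acc, mcc = binary_metrics(40, 5, 10, 45)
```

Unlike accuracy, the MCC stays informative when class sizes are unbalanced, which is why chemometric classification studies often report it alongside sensitivity and specificity.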
Abstract:
The aim of the present study was to compare four methods of fixation of mandibular body fractures. Mechanical and photoelastic tests were performed using polyurethane and photoelastic resin mandibles, respectively. The study groups were as follows: (I) two 2.0 mm miniplates; (II) one 2.0 mm plate and an Erich arch bar; (III) one 2.4 mm plate and an Erich arch bar; and (IV) one 2.0 mm plate and one 2.4 mm plate. The differences between the mean values were analyzed using Tukey's test, the Mann-Whitney test and the Bonferroni correction. Group II recorded the lowest resistance, followed by groups I, IV and III. The photoelastic test confirmed the increased tension in group II. The 2.4 mm system plate provided more resistance in linear mandibular body fractures, while the use of only one 2.0 mm plate in the central area of the mandible created higher tension.