936 results for Presentation Format
Abstract:
Master's dissertation in Applied Psychology
Abstract:
Whereas much literature has documented difficulties in making probabilistic inferences, it has also emphasized the importance of task characteristics in determining judgmental accuracy. Noting that people exhibit remarkable efficiency in encoding frequency information sequentially, we construct tasks that exploit this ability by requiring people to experience the outcomes of sequentially simulated data. We report two experiments. The first involved seven well-known probabilistic inference tasks. Participants differed in statistical sophistication and answered with and without experience obtained through sequentially simulated outcomes in a design that permitted both between- and within-subject analyses. The second experiment involved interpreting the outcomes of a regression analysis when making inferences for investment decisions. In both experiments, even the statistically naïve make accurate probabilistic inferences after experiencing sequentially simulated outcomes and many prefer this presentation format. We conclude by discussing theoretical and practical implications.
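The idea of "experiencing sequentially simulated outcomes" can be sketched with a toy base-rate task; the task, rates, and numbers below are illustrative assumptions, not material from the experiments:

```python
import random

# Toy base-rate task (rates are illustrative assumptions, not from the study):
# estimate P(disease | positive test) by "experiencing" sequentially simulated
# outcomes one at a time, rather than applying Bayes' rule analytically.
def simulate_sequential(n_trials=100_000, base_rate=0.01,
                        sensitivity=0.80, false_positive_rate=0.096, seed=42):
    rng = random.Random(seed)
    positives = diseased_positives = 0
    for _ in range(n_trials):                 # one simulated case at a time
        diseased = rng.random() < base_rate
        p_positive = sensitivity if diseased else false_positive_rate
        if rng.random() < p_positive:
            positives += 1
            diseased_positives += diseased    # bool counts as 0/1
    return diseased_positives / positives     # experienced relative frequency

estimate = simulate_sequential()              # analytic answer is roughly 0.078
```

The experienced frequency converges on the Bayesian answer without the learner ever manipulating conditional probabilities explicitly, which is the intuition behind the presentation format studied above.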
Abstract:
During the past decades testing has matured from an ad-hoc activity into an integral part of the development process. The benefits of testing are obvious for modern communication systems, which operate in heterogeneous environments amongst devices from various manufacturers. The increased demand for testing also creates demand for tools and technologies that support and automate testing activities. This thesis discusses the applicability of visualization techniques in the result analysis part of the testing process. In particular, the primary focus of this work is visualization of test execution logs produced by a TTCN-3 test system. TTCN-3 is an internationally standardized test specification and implementation language. The TTCN-3 standard suite includes a specification of a test logging interface and a graphical presentation format, but no immediate relationship between them. This thesis presents a technique for mapping the log events to the graphical presentation format, along with a concrete implementation, which is integrated with the Eclipse Platform and the OpenTTCN Tester toolchain. Results of this work indicate that for the majority of the log events, a visual representation may be derived from the TTCN-3 standard suite. The remaining events were analysed and three categories relevant to either log analysis or implementation of the visualization tool were identified: events indicating insertion of something into the incoming queue of a port, events indicating a mismatch, and events describing the control flow during the execution. The applicability of the results is limited to the domain of TTCN-3, but the developed mapping and the implementation may be utilized with any TTCN-3 tool that is able to produce the execution log in the standardized XML format.
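The three residual event categories described above could be sketched as a simple classifier; the event names here are invented placeholders, not identifiers from the TTCN-3 standard suite:

```python
# Hypothetical sketch of the three residual event categories; the event names
# are illustrative placeholders, not TTCN-3 standard logging identifiers.
QUEUE_INSERTION_EVENTS = {"portEnqueue"}
MISMATCH_EVENTS = {"receiveMismatch", "getcallMismatch"}
CONTROL_FLOW_EVENTS = {"functionEnter", "functionLeave"}

def categorize(event_name: str) -> str:
    """Assign a log event to a category relevant for log analysis."""
    if event_name in QUEUE_INSERTION_EVENTS:
        return "queue insertion"
    if event_name in MISMATCH_EVENTS:
        return "mismatch"
    if event_name in CONTROL_FLOW_EVENTS:
        return "control flow"
    return "mapped to graphical presentation format"

category = categorize("receiveMismatch")
```

Events falling outside the three special categories would follow the direct mapping to the graphical presentation format that the thesis derives from the standard suite.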
Abstract:
This research addressed the question of the role of explicit algorithms and episodic contexts in the acquisition of computational procedures for regrouping in subtraction. Three groups of students having difficulty learning to subtract with regrouping were taught procedures for doing so through either an explicit algorithm, an episodic context, or an examples approach. It was hypothesized that the use of an explicit algorithm represented in a flow chart format would facilitate the acquisition and retention of specific procedural steps relative to the other two conditions. On the other hand, the use of paragraph stories to create episodic context was expected to facilitate the retrieval of algorithms, particularly in a mixed presentation format. The subjects were tested on similar, near, and far transfer questions over a four-day period. Near and far transfer algorithms were also introduced on Day Two. The results suggested that both explicit algorithms and episodic context facilitate performance on questions requiring subtraction with regrouping. However, the differential effects of these two approaches on near and far transfer questions were not as easy to identify. Explicit algorithms may facilitate the acquisition of specific procedural steps while at the same time inhibiting the application of such steps to transfer questions. Similarly, the value of episodic context in cuing the retrieval of an algorithm may be limited by the ability of a subject to identify and classify a new question as an exemplar of a particular episodically defined problem type or category. The implications of these findings for the procedures employed in the teaching of mathematics to students with learning problems are discussed in detail.
Abstract:
The Virtual Lightbox for Museums and Archives (VLMA) is a tool for collecting and reusing, in a structured fashion, the online contents of museums and archive datasets. It is not restricted to datasets with visual components, although VLMA includes a lightbox service that enables comparison and manipulation of visual information. With VLMA, one can browse and search collections, construct personal collections, annotate them, export these collections to XML or Impress (Open Office) presentation format, and share collections with other VLMA users. VLMA was piloted as an e-Learning tool as part of JISC’s e-Learning focus in its first phase (2004-2005), and in its second phase (2005-2006) it has incorporated new partner collections while improving and expanding interfaces and services. This paper concerns its development as a research and teaching tool, especially for teachers using museum collections, and discusses the recent development of VLMA.
Abstract:
BACKGROUND Meta-analyses of continuous outcomes typically provide enough information for decision-makers to evaluate the extent to which chance can explain apparent differences between interventions. The interpretation of the magnitude of these differences - from trivial to large - can, however, be challenging. We investigated clinicians' understanding and perceptions of usefulness of 6 statistical formats for presenting continuous outcomes from meta-analyses (standardized mean difference, minimal important difference units, mean difference in natural units, ratio of means, relative risk and risk difference). METHODS We invited 610 staff and trainees in internal medicine and family medicine programs in 8 countries to participate. Paper-based, self-administered questionnaires presented summary estimates of hypothetical interventions versus placebo for chronic pain. The estimates showed either a small or a large effect for each of the 6 statistical formats for presenting continuous outcomes. Questions addressed participants' understanding of the magnitude of treatment effects and their perception of the usefulness of the presentation format. We randomly assigned participants 1 of 4 versions of the questionnaire, each with a different effect size (large or small) and presentation order for the 6 formats (1 to 6, or 6 to 1). RESULTS Overall, 531 (87.0%) of the clinicians responded. Respondents best understood risk difference, followed by relative risk and ratio of means. Similarly, they perceived the dichotomous presentation of continuous outcomes (relative risk and risk difference) to be most useful. Presenting results as a standardized mean difference, the longest standing and most widely used approach, was poorly understood and perceived as least useful. INTERPRETATION None of the presentation formats were well understood or perceived as extremely useful. 
Clinicians best understood the dichotomous presentations of continuous outcomes and perceived them to be the most useful. Further initiatives to help clinicians better grasp the magnitude of the treatment effect are needed.
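Several of the formats studied above can be derived from the same summary statistics. A minimal sketch with assumed means and SDs for a pain score where lower is better (the dichotomous formats, relative risk and risk difference, additionally require a responder threshold and are not shown):

```python
# Illustrative only: four of the six presentation formats computed from
# assumed summary statistics; no numbers here come from the survey itself.
def effect_formats(mean_tx, mean_ctl, sd_pooled, mid):
    md = mean_tx - mean_ctl           # mean difference in natural units
    return {
        "MD": md,                     # natural units of the outcome scale
        "SMD": md / sd_pooled,        # standardized mean difference (Cohen's d)
        "RoM": mean_tx / mean_ctl,    # ratio of means
        "MID units": md / mid,        # minimal important difference units
    }

out = effect_formats(mean_tx=3.0, mean_ctl=5.0, sd_pooled=2.5, mid=1.0)
# e.g. out["SMD"] == -0.8: treatment lowers pain by 0.8 pooled SDs
```

The point of the survey is that the same underlying effect looks quite different across these re-expressions, which is why clinicians' comprehension varies by format.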
Abstract:
This paper reports on the survey of the characteristic features of national input-output tables compiled by the member countries of the Asian International Input-Output Table project. In making any inter-regional tables, the presentation format of each constituent table has to be carefully studied in order to design a common adjustment rule. The survey was conducted in the period of 2003-04, with invaluable cooperation from each collaborating institution of the project. Some analytical findings are drawn from the survey results, such as the similarity between each national table and the Japanese table, the responsiveness to the 1993 SNA, and the major areas of conflict regarding the presentation format.
Abstract:
The Universidad Politécnica de Madrid (UPM) comprises schools and faculties that historically offered degrees in engineering, architecture and computer science, and that are now undergoing a rapid transformation into the degree, master and doctorate structures of the EEES Bologna Plan. These programmes are oriented towards action on machines, constructions and enterprises, which are subject to risks created by machines, humans and the environment. Such risks appear in actions such as service loads, wind, snow, waves, flows, earthquakes, forces and effects in machines, vehicle behaviour, chemical effects, and other environmental factors, including effects on crops, cattle, forests, and varied essential economic and social disturbances. In this session the authors place the emphasis on risks of natural origin, such as hail, wind, snow or waves, which are not exactly known a priori but are often modelled with statistical distributions whose extreme values correspond to convenient return periods. These distributions are estimated from measurements over time, from the statistics of extremes, and from models of hazard scenarios and of the responses of man-made constructions or devices. In each engineering field, theories have been built about hazard scenarios and about how to cover the important risks. Engineers must ensure that the systems they handle, such as vehicles, machines, firms, agricultural land or forests, remain productive with sufficient safety for persons and with decent economic results in spite of risks. To that end, risks must be considered in planning, realization and operation, and safety margins must be adopted, but at a reasonable cost. A small level of risk will therefore often remain, owing to cost limitations or to rare hazards, and it may be covered by insurance, as in transport by car, ship or aircraft, in agriculture for hail, or for fire in houses or forests.
These and other decisions about quality, security of persons or business financial risks are sometimes analysed with Decision Theory models, often using tools from Statistics or Operational Research. The authors have carried out, and are continuing, field surveys on how risk is treated in the careers at UPM, performing a deep analysis of the curricula in the light of the new degree structures of the EEES Bologna Plan, and they have considered the risk frameworks offered by diverse schools of Decision Theory. This yields a picture of needs and uses, together with recommendations for improving the teaching of risk, which may include special subjects oriented to each career, school or faculty, recommended for inclusion in the curricula, with an elaboration and presentation format based on a multi-criteria decision model.
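As one concrete instance of "extreme values for convenient return periods", a Gumbel model of annual maxima gives the design value exceeded on average once per return period. Both the choice of distribution and the parameters below are assumptions for illustration, not values from the surveys:

```python
import math

# Sketch with assumed parameters: the design value of an annual-maximum
# hazard (e.g. a wind speed in m/s) for a chosen return period, under a
# Gumbel distribution of annual maxima (location mu, scale beta).
def gumbel_return_level(mu, beta, return_period_years):
    p_non_exceedance = 1.0 - 1.0 / return_period_years  # annual probability
    return mu - beta * math.log(-math.log(p_non_exceedance))

v50 = gumbel_return_level(mu=25.0, beta=4.0, return_period_years=50)
# roughly 40.6 with these assumed parameters
```

This is the kind of calculation behind the "extreme values for convenient return periods" that engineering curricula on risk typically cover.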
Abstract:
Given the growing number of wrongful convictions involving faulty eyewitness evidence and the strong reliance by jurors on eyewitness testimony, researchers have sought to develop safeguards to decrease erroneous identifications. While decades of eyewitness research have led to numerous recommendations for the collection of eyewitness evidence, less is known regarding the psychological processes that govern identification responses. The purpose of the current research was to expand the theoretical knowledge of eyewitness identification decisions by exploring two separate memory theories: signal detection theory and dual-process theory. This was accomplished by examining both system and estimator variables in the context of a novel lineup recognition paradigm. Both theories were also examined in conjunction with confidence to determine whether it might add significantly to the understanding of eyewitness memory. In two separate experiments, both an encoding and a retrieval-based manipulation were chosen to examine the application of theory to eyewitness identification decisions. Dual-process estimates were measured through the use of remember-know judgments (Gardiner & Richardson-Klavehn, 2000). In Experiment 1, the effects of divided attention and lineup presentation format (simultaneous vs. sequential) were examined. In Experiment 2, perceptual distance and lineup response deadline were examined. Overall, the results indicated that discrimination and remember judgments (recollection) were generally affected by variations in encoding quality and response criterion and know judgments (familiarity) were generally affected by variations in retrieval options. Specifically, as encoding quality improved, discrimination ability and judgments of recollection increased; and as the retrieval task became more difficult there was a shift toward lenient choosing and more reliance on familiarity.
The application of signal detection theory and dual-process theory in the current experiments produced predictable results on both system and estimator variables. These theories were also compared to measures of general confidence, calibration, and diagnosticity. The application of the additional confidence measures in conjunction with signal detection theory and dual-process theory gave a more in-depth explanation than either theory alone. Therefore, the general conclusion is that eyewitness identifications can be understood in a more complete manner by applying theory and examining confidence. Future directions and policy implications are discussed.
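The signal-detection quantities underlying this kind of analysis, discrimination (d') and response criterion (c), can be computed from hit and false-alarm rates. A minimal sketch with hypothetical rates (not data from the experiments):

```python
from statistics import NormalDist

# Minimal signal-detection sketch (rates are hypothetical): discrimination
# (d') and response criterion (c) from hit and false-alarm rates via the
# inverse normal CDF.
def sdt_measures(hit_rate, false_alarm_rate):
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(false_alarm_rate)               # discrimination
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))    # >0: conservative
    return d_prime, criterion

d_prime, criterion = sdt_measures(hit_rate=0.80, false_alarm_rate=0.20)
# symmetric rates give criterion == 0, i.e. unbiased responding
```

In lineup terms, better encoding raises d', while manipulations of the retrieval task shift the criterion toward lenient or conservative choosing, matching the pattern reported above.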
Abstract:
This study examined the interaction of age, attitude, and performance within the context of an interactive computer testing experience. Subjects were 13 males and 47 females between the ages of 55 and 82, with a minimum of a high school education. Initial attitudes toward computers, as measured by the Cybernetics Attitude Scale (CAS), demonstrated overall equivalence between these older subjects and previously tested younger subjects. Post-intervention scores on the CAS indicated that attitudes toward computers were unaffected by either a "fun" or a "challenging" computer interaction experience. The differential effects of a computerized vs. a paper-and-pencil presentation format of a 20-item, multiple choice vocabulary test were examined. Results indicated no significant differences in the performance of subjects in the two conditions, and no interaction effect between attitude and performance. These findings suggest that the attitudes of older adults towards computers do not affect their computerized testing performance, at least for short term testing of verbal abilities. A further implication is that, under the conditions presented here, older subjects appear to be unaffected by mode of testing. The impact of recent advances in technology on older adults is discussed.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
Research literature and regulators are unconditional in pointing to the disclosure of operating cash flow through the direct method as a source of unique information. Besides its intuitive facet, it is also consistent in forecasting future operating cash flows and a cohesive piece of the financial statement puzzle. Bearing this in mind, I analyse the usefulness and predictive ability of disclosing gross cash receipts and payments over disclosing the reconciliation between net income and accruals for two markets with special features, Portugal and Spain. Results validate the usefulness of the direct method format in predicting future operating cash flow.
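The two disclosure formats report the same operating cash flow by different routes. A toy reconciliation, with all figures invented for illustration:

```python
# Toy numbers (invented for illustration): the same operating cash flow shown
# under the direct method (gross receipts and payments) and the indirect
# method (net income reconciled for accruals and non-cash items).
receipts_from_customers = 1000
payments_to_suppliers = -600
payments_to_employees = -250
direct_ocf = (receipts_from_customers + payments_to_suppliers
              + payments_to_employees)

net_income = 120
depreciation = 50               # non-cash expense, added back
increase_in_receivables = -30   # revenue booked but not yet collected
increase_in_payables = 10       # expense booked but not yet paid
indirect_ocf = (net_income + depreciation
                + increase_in_receivables + increase_in_payables)

assert direct_ocf == indirect_ocf == 150   # both routes agree
```

The direct method exposes the gross receipts and payments themselves, which is the "unique information" the abstract argues is useful for prediction; the indirect method reveals only the reconciling accrual adjustments.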
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Task overview, requirements, format and protocols plus guidance and mark sheet
Abstract:
Due to multiple immune evasion mechanisms of cancer cells, novel therapy approaches are required to overcome the limitations of existing immunotherapies. Bispecific antibodies are potent anti-cancer drugs, which redirect effector T cells for specific tumor cell lysis, thus enabling the patient’s immune system to fight cancer cells. The antibody format used in this proof-of-concept study, bispecific ideal monoclonal antibodies termed BiMAB, is a tailor-made recombinant protein, which consists of two fused scFv antibodies recognizing different antigens. Both are arranged in tandem on a single peptide chain and the individual variable binding domains are separated by special non-immunogenic linkers. The format comprises an scFv targeting CLDN18.2, a gastric cancer tumor-associated antigen (TAA), while the second specificity binds the CD3 epsilon (CD3ε) subunit of the T cell receptor (TCR) on T cells. For the first time, we compared, in our IMAB362-based BiMAB setting, four different anti-CD3 scFvs, derived respectively from the mAbs TR66 and CLB-T3, as well as the humanized and the murine variant of UCHT1. In addition, we investigated the impact of an N- versus a C-terminal location of the IMAB362-derived scFv and the anti-CD3 scFvs. Thus, nine CLDN18.2-specific BiMAB proteins were generated, all of which showed remarkably high cytotoxicity towards CLDN18.2-positive tumor cells. Because of its promising effectiveness, 1BiMAB emerged as the BiMAB prototype. The selectivity of 1BiMAB for its TAA and CD3ε, with affinities in the nanomolar range, has been confirmed by in vitro assays. Its dual binding depends on the design of an N-terminally positioned IMAB362 scFv and the consecutive C-terminally positioned TR66 scFv. 1BiMAB provoked concentration- and target-cell-dependent T cell activation, proliferation, and upregulation of the cytolytic protein Granzyme B, as well as the consequent elimination of target cells.
Our results demonstrate that 1BiMAB is able to activate T cells independent of elements that are usually involved in the T cell recognition program, like antigen presentation, MHC restriction, and co-stimulatory effector molecules. In the first in vivo studies using a subcutaneous xenogeneic tumor mouse model in immune incompetent NSG mice, we could prove a significant therapeutic effect of 1BiMAB with partial or complete tumor elimination. The initial in vitro RIBOMAB experiments correspondingly showed encouraging results. The electroporation of 1BiMAB IVT-RNA into target or effector cells was feasible, while the functionality of translated 1BiMAB was proven by induced T cell activation and target cell lysis. Accordingly, we could show that the in vitro RIBOMAB approach was applicable for all nine BiMABs, which proves the RIBOMAB concept. Thus, the CLDN18.2-BiMAB strategy offers great potential for the treatment of cancer. In the future, administered either as protein or as IVT-RNA, the BiMAB format will contribute towards finding solutions to raise and sustain tumor-specific cellular responses elicited by engaged and activated endogenous T cells. This will potentially enable us to overcome immune evasion mechanisms of tumor cells, consequently supporting current solid gastric cancer therapies.