919 results for Probability Pattern comparison Evaluation and interpretation
Abstract:
This study examines the relationship between teachers' use of English textbooks and the way teachers evaluate and adapt them, looking at a particular context, Capeverdean secondary schools, specifically in Praia. This relationship was analyzed through teachers' responses about how they use, evaluate and adapt their textbooks. The results of the study revealed that, on the one hand, the way teachers use their textbooks influences the way they evaluate those same textbooks; on the other hand, the use of textbooks does not necessarily influence the way teachers adapt them. Moreover, the findings revealed that, in general, due to some particular constraints, Capeverdean English teachers are using their textbooks as resources, with several textbooks used in combination with one another. Additionally, although teachers assume that they are doing their best, they still need more confidence in the way they use, evaluate and adapt available textbooks. Teachers' confidence in the way they are using their textbooks can be reinforced by establishing an intensive teacher training module on materials evaluation and adaptation, given that a textbook is one of the most important tools in the process of teaching and learning. I hope that the elements presented may lead to further studies on this matter, specifically regarding textbook evaluation and adaptation.
Abstract:
This paper studies a balance whose unobservable fulcrum is not necessarily located at the middle of its two pans. It presents three different models, showing how this lack of symmetry modifies the observation, the formalism and the interpretation of such a biased measuring device. It argues that the biased balance can be an interesting source of inspiration for broadening the representational theory of measurement.
Abstract:
This paper proposes an exploration of the methodology of utility functions that distinguishes interpretation from representation. While representation univocally assigns numbers to the entities of the domain of utility functions, interpretation relates these entities with empirically observable objects of choice. This allows us to make explicit the standard interpretation of utility functions, which assumes that two objects have the same utility if and only if the individual is indifferent between them. We explore the underlying assumptions of such a hypothesis and propose a non-standard interpretation according to which objects of choice have a well-defined utility although individuals may vary in the way they treat these objects in a specific context. We provide examples of such a methodological approach that may explain some reversals of preferences and suggest possible mathematical formulations for further research.
Abstract:
Due to various contexts and processes, forensic science communities may have different approaches, largely influenced by their criminal justice systems. However, forensic science practices share some common characteristics. One is the assurance of a high (scientific) quality within processes and practices. For most crime laboratory directors and forensic science associations, this issue is conditioned by the triangle of quality, which represents the current paradigm of quality assurance in the field. It consists of the implementation of standardization, certification, accreditation, and an evaluation process. It constitutes a clear and sound way to exchange data between laboratories and enables databasing due to standardized methods ensuring reliable and valid results; but it is also a means of defining minimum requirements for practitioners' skills for specific forensic science activities. The control of each of these aspects offers non-forensic science partners the assurance that the entire process has been mastered and is trustworthy. Most of the standards focus on the analysis stage and do not consider pre- and post-laboratory stages, namely, the work achieved at the investigation scene and the evaluation and interpretation of the results, intended for intelligence beneficiaries or for court. Such localized consideration prevents forensic practitioners from identifying where the problems really lie with regard to criminal justice systems. According to a performance-management approach, scientific quality should not be restricted to standardized procedures and controls in forensic science practice. Ensuring high quality also strongly depends on the way a forensic science culture is assimilated (into specific education training and workplaces) and in the way practitioners understand forensic science as a whole.
Abstract:
We study the motion of an unbound particle under the influence of a random force modeled as Gaussian colored noise with an arbitrary correlation function. We derive exact equations for the joint and marginal probability density functions and find the associated solutions. We analyze in detail anomalous diffusion behaviors along with the fractal structure of the trajectories of the particle and explore possible connections between dynamical exponents of the variance and the fractal dimension of the trajectories.
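The setup this abstract describes can be sketched numerically. Below is a minimal Monte Carlo illustration, not taken from the paper, assuming the special case of an exponential correlation function (the noise is then an Ornstein-Uhlenbeck process) and illustrative parameters `tau`, `sigma`, `dt`:

```python
import numpy as np

# Hedged sketch (not the paper's derivation): a free particle dx/dt = f(t)
# driven by Gaussian colored noise f with an exponential correlation
# function, generated as an Ornstein-Uhlenbeck process.
rng = np.random.default_rng(0)

def simulate_variance(n_steps=2000, n_paths=500, dt=0.01, tau=1.0, sigma=1.0):
    x = np.zeros(n_paths)              # particle positions
    f = np.zeros(n_paths)              # colored-noise force
    a = np.exp(-dt / tau)              # exact one-step OU decay factor
    var = np.empty(n_steps)
    for i in range(n_steps):
        # exact update of the stationary OU process (variance sigma**2)
        f = a * f + sigma * np.sqrt(1 - a**2) * rng.standard_normal(n_paths)
        x = x + f * dt                 # free particle: velocity equals force
        var[i] = x.var()               # ensemble variance of position
    return var

var = simulate_variance()
# Var(x) grows ballistically (~t**2) for t << tau and diffusively (~t)
# for t >> tau, the simplest example of the variance exponents the
# abstract refers to.
```

For this special case the variance is known in closed form, 2·sigma²·tau·(t − tau·(1 − e^(−t/tau))), which the simulated curve can be checked against.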
Abstract:
Background: To compare the characteristics and prognostic features of ischemic stroke in patients with and without diabetes, and to determine the independent predictors of in-hospital mortality in people with diabetes and ischemic stroke. Methods: Diabetes was diagnosed in 393 (21.3%) of 1,840 consecutive patients with cerebral infarction included in a prospective stroke registry over a 12-year period. Demographic characteristics, cardiovascular risk factors, clinical events, stroke subtypes, neuroimaging data, and outcome in ischemic stroke patients with and without diabetes were compared. Predictors of in-hospital mortality in diabetic patients with ischemic stroke were assessed by multivariate analysis. Results: People with diabetes, compared to people without diabetes, more frequently presented atherothrombotic stroke (41.2% vs 27%) and lacunar infarction (35.1% vs 23.9%) (P < 0.01). In-hospital mortality was 12.5% in ischemic stroke patients with diabetes and 14.6% in those without (P = NS). Ischemic heart disease, hyperlipidemia, subacute onset, age 85 years or more, atherothrombotic and lacunar infarcts, and thalamic topography were independently associated with ischemic stroke in patients with diabetes, whereas predictors of in-hospital mortality included the patient's age, decreased consciousness, chronic nephropathy, congestive heart failure and atrial fibrillation. Conclusion: Ischemic stroke in people with diabetes showed a different clinical pattern from that in people without diabetes, with atherothrombotic stroke and lacunar infarcts being more frequent. Clinical factors indicative of the severity of ischemic stroke available at onset have a predominant influence upon in-hospital mortality and may help clinicians assess prognosis more accurately.
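The "multivariate analysis" of mortality predictors mentioned above is typically a logistic regression. As a hedged illustration only, the sketch below fits one by plain gradient descent on synthetic data; the predictor names, coefficients, and data are invented stand-ins, not the registry data or the statistical package the study actually used:

```python
import numpy as np

# Illustrative only: synthetic patients with two hypothetical predictors
# of in-hospital mortality (standardized age, decreased consciousness).
rng = np.random.default_rng(42)
n = 1000
age = rng.standard_normal(n)                 # standardized age (assumed)
consciousness = rng.integers(0, 2, n)        # 0/1 decreased consciousness
X = np.column_stack([np.ones(n), age, consciousness])
true_beta = np.array([-2.0, 0.8, 1.5])       # invented "true" effects
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.random(n) < p                        # simulated in-hospital deaths

# Fit logistic regression by gradient descent on the log-likelihood.
beta = np.zeros(3)
for _ in range(5000):
    grad = X.T @ (1 / (1 + np.exp(-X @ beta)) - y) / n
    beta -= 0.5 * grad
# Positive fitted coefficients mark predictors that raise mortality risk,
# which is how "independent predictors" are read off in such analyses.
```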
Abstract:
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
In 2001, it became evident that the domiciliary care nurses needed a tool to assist them in treating patients with chronic wounds. A protocol was therefore developed which could be used not only by the nurses but also by doctors and other health care professionals working in home care. As a parallel measure, a network of nurses specialised in wound care and available for advice and consultation was established.
Abstract:
AIM: To determine the extent drinking patterns (at the individual and country level) are associated with alcohol-related consequences over and above the total alcohol the person consumes. METHODS: Hierarchical linear models were estimated based on general population surveys conducted in 18 countries participating in the GENACIS project. RESULTS: In general, the positive association between drinking pattern scores and alcohol-related consequences was found at both the individual and country levels, independent of volume of drinking. In addition, a significant interaction effect indicated that the more detrimental the country's drinking pattern, the less steep the association between the volume of drinking and its consequences. CONCLUSION: Drinking patterns have an independent impact on consequences over and above the relationship between volume and consequences.
Abstract:
The determination of line crossing sequences between rollerball pens and laser printers presents difficulties that may not be overcome using traditional techniques. This research aimed to study the potential of digital microscopy and 3-D laser profilometry to determine line crossing sequences between a toner and an aqueous ink line. Different paper types, rollerball pens, and writing pressures were tested. Correct opinions of the sequence were given for all case scenarios, using both techniques. When the toner was printed before the ink, a light reflection was observed in all crossing specimens, while this was never observed in the other sequence types. 3-D laser profilometry, though more time-consuming, offered the main advantage of providing quantitative results. The findings confirm the potential of 3-D laser profilometry and demonstrate the efficiency of digital microscopy as a new technique for determining the sequence of line crossings involving rollerball pen ink and toner. With the mass marketing of laser printers and the popularity of rollerball pens, the determination of line crossing sequences between such instruments is encountered by forensic document examiners. This type of crossing presents difficulties for the optical microscopic techniques used on crossings involving ballpoint or gel pens and toner (1-4). Indeed, the rollerball's aqueous ink penetrates through the toner and is absorbed by the fibers of the paper, leaving the examiner with the impression that the toner is above the ink even when it is not (5). Novotny and Westwood (3) investigated the possibility of determining aqueous ink and toner crossing sequences by microscopic observation of the intersection before and after toner removal. A major disadvantage of their approach is the destruction of the sample, since the toner line is scraped off to see what lies underneath.
The aim of this research was to investigate ways to overcome these difficulties through digital microscopy and three-dimensional (3-D) laser profilometry. The former has been used as a technique for the determination of sequences between gel pen and toner printing strokes, but provided less conclusive results than those of an optical stereomicroscope (4). 3-D laser profilometry, which allows one to observe and measure the topography of a surface, has been the subject of a number of recent studies in this area. Berx and De Kinder (6) and Schirripa Spagnolo (7,8) tested the application of laser profilometry to determine the sequence of intersections of several lines. The results obtained in these studies overcome disadvantages of other methods applied in this area, such as the scanning electron microscope or the atomic force microscope. The main advantages of 3-D laser profilometry include the ease of implementation of the technique and its nondestructive nature, which does not require sample preparation (8-10). Moreover, the technique is reproducible and offers a large measuring range along the vertical axis (up to 1000 μm). However, when the paper surface presents a given roughness, and the pen impressions alter the paper to a depth similar to that roughness, the results are not always conclusive (8). In that case it becomes difficult to distinguish which characteristics can be imputed to the pen impressions and which to the quality of the paper surface. This important limitation was assessed by testing different types of paper of variable quality (of different grammage and finishing) and different writing pressures. The authors therefore assess the limits of the 3-D laser profilometry technique and determine whether the method can overcome such constraints. Second, the authors investigate the use of digital microscopy because it presents a number of advantages: it is efficient, user-friendly, and provides an objective evaluation and interpretation.
Abstract:
Stream degradation is the action of deepening the stream bed and widening the banks due to the increasing velocity of water flow. Degradation is pervasive in channeled streams found within the deep to moderately deep loess regions of the central United States. Of all the streams, however, the most severe and widespread entrenchment occurs in western Iowa streams that are tributaries to the Missouri River. In September 1995 the Iowa Department of Transportation awarded a grant to Golden Hills Resource Conservation and Development, Inc. The purpose of the grant, HR-385 "Stream Stabilization in Western Iowa: Structure Evaluation and Design Manual", was to provide an assessment of the effectiveness and costs of various stabilization structures in controlling erosion on channeled streams. A review of literature, a survey of professionals, field observations and an analysis of the data recorded on fifty-two selected structures led to the conclusions presented in the project's publication, Design Manual, Streambed Degradation and Streambank Widening in Western Iowa. Technical standards and specifications for the design and construction of stream channel stabilization structures are included in the manual. Additional information on non-structural measures, monitoring and evaluation of structures, various permit requirements and further resources are also included. Findings of the research project and use and applications of the Design Manual were presented at two workshops in the Loess Hills region. Participants in these workshops included county engineers, private contractors, state and federal agency personnel, elected officials and others. The Design Manual continues to be available through Golden Hills Resource Conservation and Development.
Abstract:
This research consisted of five laboratory experiments designed to address the following two objectives in an integrated analysis: (1) To discriminate between the symbol Stop Ahead warning sign and a small set of other signs (which included the word-legend Stop Ahead sign); and (2) To analyze sign detection, recognizability, and processing characteristics by drivers. A set of 16 signs was used in each of three experiments. A tachistoscope was used to display each sign image to a respondent for a brief interval in a controlled viewing experiment. The first experiment was designed to test detection of a sign in the driver's visual field; the second experiment was designed to test the driver's ability to recognize a given sign in the visual field; and the third experiment was designed to test the speed and accuracy of a driver's response to each sign as a command to perform a driving action. A fourth experiment tested the meanings drivers associated with an eight-sign subset of the 16 signs used in the first three experiments. A fifth experiment required all persons to select which (if any) signs they considered to be appropriate for use on two scale model county road intersections. The conclusions are that word-legend Stop Ahead signs are more effective driver communication devices than symbol stop-ahead signs; that it is helpful to drivers to have a word plate supplementing the symbol sign if a symbol sign is used; and that the guidance in the Manual on Uniform Traffic Control Devices on the placement of advance warning signs should not supplant engineering judgment in providing proper sign communication at an intersection.
Abstract:
EEG recordings are usually corrupted by spurious extra-cerebral artifacts, which should be rejected or cleaned up by the practitioner. Since manual screening of human EEGs is inherently error prone and may induce experimental bias, automatic artifact detection is an issue of importance: it is the best guarantee for objective and clean results. We present a new approach, based on the time–frequency shape of muscular artifacts, to achieve reliable and automatic scoring. This methodology makes it possible to evaluate the impact of muscular activity on the signal while placing emphasis on the analysis of EEG activity. The method is used to discriminate evoked potentials from several types of recorded muscular artifacts, with a sensitivity of 98.8% and a specificity of 92.2%. Automatic cleaning of EEG data is then successfully achieved using this method, combined with independent component analysis. The outcome of the automatic cleaning is then compared with the Slepian multitaper spectrum based technique introduced by Delorme et al (2007 Neuroimage 34 1443–9).
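The core idea of time-frequency artifact scoring can be illustrated in a few lines. This is a minimal sketch, not the authors' algorithm: it flags windows of a synthetic signal as muscle-contaminated when the fraction of spectral power above a cutoff frequency exceeds a threshold; the sampling rate, cutoff, and threshold are all assumptions chosen for the toy data:

```python
import numpy as np

# Toy signal: a clean 10 Hz "alpha" rhythm, with broadband muscle-like
# noise added only in the second half of the recording.
fs = 256                                   # sampling rate (Hz), assumed
t = np.arange(0, 4, 1 / fs)                # 4 s of data
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10 * t)           # clean rhythm
emg = rng.standard_normal(t.size) * 3.0    # broadband muscle-like noise
emg[: t.size // 2] = 0                     # artifact in second half only
signal = eeg + emg

def muscle_score(x, fs, lo=30.0):
    """Fraction of spectral power above `lo` Hz in one window."""
    spec = np.abs(np.fft.rfft(x * np.hanning(x.size))) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    return spec[freqs >= lo].sum() / spec.sum()

win = fs // 2                              # 0.5 s analysis windows
scores = [muscle_score(signal[i:i + win], fs)
          for i in range(0, signal.size - win + 1, win)]
flags = [s > 0.5 for s in scores]          # threshold is an assumption
# Clean windows concentrate power at 10 Hz (score near 0); contaminated
# windows spread power across the spectrum (score well above 0.5).
```

A real pipeline would apply such a score per independent component rather than per raw channel, which is the role ICA plays in the abstract above.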
Abstract:
The State of Iowa has too many roads. Although ranking thirty-fourth in population, twenty-fifth in area, and twentieth in motor vehicle registration, it ranks seventh in the nation in miles of rural roads. In 1920 when Iowa's rural population was 1,528,000, there were 97,440 miles of secondary roads. In 1960 with rural population down 56 percent to 662,000, there were 91,000 miles of secondary roads--a 7 percent decrease. The question has been asked: "Who are these 'service roads' serving?" This excess mileage tends to dissipate road funds at a critical time of increasing public demand for better and safer roads.