995 results for B., A. P.
Abstract:
There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the ‘utilization’, which is defined as the percentage of available ‘seat-hours’ that are employed. Within real institutions, studies have shown that this utilization can often take values as low as 20–40%. One consequence of such a low level of utilization is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens. This is accompanied, within space planning (long-term planning), by a lack of expertise on how best to accommodate the expected low utilizations. This motivates our two main goals: (i) to understand the factors that drive down utilizations, and (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilizations seen in reality. Furthermore, on considering the decision question ‘Can this given set of courses all be allocated in the available teaching space?’, we find that the answer depends on the associated utilization in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is ‘almost always yes’ and those in which it is ‘almost always no’.
Through analysis and understanding of the space of potential solutions, our work suggests that better use of space within universities will come about through an understanding of the effects of timetabling constraints, and of when it is statistically likely that a given set of courses can be allocated to a particular space. The results presented here provide a firm foundation for university managers to take decisions on how space should be managed and planned for more effectively. Our multi-criteria approach and new methodology together provide new insight into the interaction between the course timetabling problem and the crucial issue of space planning.
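The utilization measure this abstract is built around is straightforward to compute. Below is a minimal Python sketch; the rooms, hours, and bookings are hypothetical figures invented for illustration, not data from the study.

```python
# Utilization = fraction of available seat-hours actually used.
# All rooms, hours, and bookings below are hypothetical illustrations.

rooms = {"A": 100, "B": 60, "C": 40}   # room -> seat capacity
hours_available = 40                    # bookable teaching hours per week, per room

# Each booking: (room, hours occupied per week, students attending)
bookings = [("A", 20, 55), ("B", 18, 30), ("C", 25, 35)]

available_seat_hours = sum(cap * hours_available for cap in rooms.values())
used_seat_hours = sum(hours * students for _, hours, students in bookings)

utilization = used_seat_hours / available_seat_hours
print(f"Utilization: {utilization:.0%}")  # → Utilization: 31%
```

With these invented figures the utilization comes out at about 31%: the rooms are booked roughly half the time and roughly half full when in use, landing inside the 20–40% band the abstract reports for real institutions.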
Abstract:
A standard problem within universities is that of teaching space allocation, which can be thought of as the assignment of rooms and times to various teaching activities. The focus is usually on courses that are expected to fit into one room. However, it can also happen that a course will need to be broken up, or ‘split’, into multiple sections. A lecture might be too large to fit into any one room. Another common example is that of seminars or tutorials. Although hundreds of students may be enrolled on a course, it is often subdivided into particular types and sizes of events depending on the pedagogic requirements of that particular course. Typically, decisions as to how to split courses need to be made within the context of limited space. Institutions do not have an unlimited number of teaching rooms, and need to use those that they do have effectively. The efficiency of space usage is usually measured by the overall ‘utilisation’, which is basically the fraction of the available seat-hours that are actually used. A multi-objective optimisation problem naturally arises, with a trade-off between satisfying preferences on splitting, increasing utilisation, and satisfying other constraints such as those based on event location and timetabling conflicts. In this paper, we explore such trade-offs. The explorations themselves are based on a local search method that attempts to optimise the space utilisation by means of a ‘dynamic splitting’ strategy. The local moves are designed to improve utilisation and satisfy the other constraints, but are also allowed to split, and un-split, courses so as to simultaneously meet the splitting objectives.
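The splitting idea itself can be illustrated with a toy sketch: a course too large for any single room is broken into sections that each fit some room. This greedy one-liner is an illustration of the concept only, not the paper's dynamic-splitting local search, and the enrolment and capacities are made up.

```python
# Toy illustration of course 'splitting': break a course that is too
# large for any single room into sections that each fit a room.
# Greedy sketch for illustration; not the paper's local-search method.

def split_course(enrolment, room_capacities):
    """Return a list of section sizes, each no larger than the biggest room."""
    largest = max(room_capacities)
    sections = []
    remaining = enrolment
    while remaining > 0:
        size = min(remaining, largest)
        sections.append(size)
        remaining -= size
    return sections

print(split_course(230, [100, 60, 40]))  # → [100, 100, 30]
```

A real allocator must additionally respect how many rooms of each size exist and when they are free; it is exactly that interaction between splitting, utilisation, and timetabling constraints that the paper's multi-objective search explores.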
Abstract:
Antimicrobial peptides play an important role in host defence, particularly in the oral cavity where there is constant challenge by microorganisms. The α-defensin antimicrobial peptides comprise 30–50% of the total protein in the azurophilic granules of human neutrophils, the most abundant of which is human neutrophil peptide 1 (HNP-1). Despite its antimicrobial activity, a limiting factor in the potential therapeutic use of HNP-1 is its chemical synthesis with the correct disulphide topology. In the present study, we synthesised a range of truncated defensin analogues lacking disulphide bridges. All the analogues were modelled on the C-terminal region of HNP-1 and their antimicrobial activity was tested against a range of microorganisms, including oral pathogens. Although there was variability in the antimicrobial activity of the truncated analogues synthesised, a truncated peptide named 2Abz23S29 displayed a broad spectrum of antibacterial activity, effectively killing all the bacterial strains tested. The finding that truncated peptides, modelled on the C-terminal β-hairpin region of HNP-1 but lacking disulphide bridges, display antimicrobial activity could aid their potential use in therapeutic interventions.
Abstract:
The potential of Raman spectroscopy for the determination of meat quality attributes has been investigated using data from a set of 52 cooked beef samples, which were rated by trained taste panels. The Raman spectra, shear force and cooking loss were measured, and partial least squares (PLS) regression was used to correlate the attributes with the Raman data. Good correlations and standard errors of prediction were found when the Raman data were used to predict the panels' rating of acceptability of texture (R² = 0.71, root mean square error of prediction as a percentage of the mean (RMSEP % of μ) = 15%), degree of tenderness (R² = 0.65, RMSEP % of μ = 18%), degree of juiciness (R² = 0.62, RMSEP % of μ = 16%), and overall acceptability (R² = 0.67, RMSEP % of μ = 11%). In contrast, the mechanically determined shear force was poorly correlated with tenderness (R² = 0.15). Tentative interpretation of the plots of the regression coefficients suggests that the α-helix to β-sheet ratio of the proteins and the hydrophobicity of the myofibrillar environment are important factors contributing to the shear force, tenderness, texture and overall acceptability of the beef. In summary, this work demonstrates that Raman spectroscopy can be used to predict consumer-perceived beef quality. In part, this overall success is due to the fact that the Raman method predicts texture and tenderness, which are the predominant factors in determining overall acceptability in the Western world. Nonetheless, it is clear that Raman spectroscopy has considerable potential as a method for non-destructive and rapid determination of beef quality parameters.
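The kind of PLS modelling the abstract describes can be sketched compactly. Below is a minimal single-response PLS (NIPALS) implementation in NumPy, fitted to synthetic "spectra"; the data, the number of channels, and the component count are all assumptions for illustration, not the study's actual spectra or panel scores.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Single-response PLS via NIPALS. Returns regression coefficients
    plus the means needed to centre new data. Illustrative sketch."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                      # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                         # scores
        tt = t @ t
        p = Xc.T @ t / tt                  # X loadings
        qk = yc @ t / tt                   # y loading
        Xc = Xc - np.outer(t, p)           # deflate
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)
    return coef, x_mean, y_mean

def pls1_predict(X, coef, x_mean, y_mean):
    return (X - x_mean) @ coef + y_mean

# Synthetic stand-in for spectra: 52 samples x 20 channels (hypothetical).
rng = np.random.default_rng(0)
X = rng.normal(size=(52, 20))
true_coef = np.zeros(20)
true_coef[:5] = [2.0, -1.0, 0.5, 1.5, -0.5]
y = X @ true_coef + rng.normal(scale=0.1, size=52)

coef, xm, ym = pls1_fit(X, y, n_components=5)
y_hat = pls1_predict(X, coef, xm, ym)
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 on training data: {r2:.3f}")
```

The fitted `coef` vector is the analogue of the "plots of the regression coefficients" the abstract mentions: inspecting which spectral channels carry large coefficients is what allows the tentative chemical interpretation.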
Abstract:
A new calibration curve for the conversion of radiocarbon ages to calibrated (cal) ages has been constructed and internationally ratified to replace IntCal98, which extended from 0–24 cal kyr BP (Before Present, 0 cal BP = AD 1950). The new calibration data set for terrestrial samples extends from 0–26 cal kyr BP, but with much higher resolution beyond 11.4 cal kyr BP than IntCal98. Dendrochronologically dated tree-ring samples cover the period from 0–12.4 cal kyr BP. Beyond the end of the tree rings, data from marine records (corals and foraminifera) are converted to the atmospheric equivalent with a site-specific marine reservoir correction to provide terrestrial calibration from 12.4–26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a coherent statistical approach based on a random walk model, which takes into account the uncertainty in both the calendar age and the ¹⁴C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The tree-ring data sets, sources of uncertainty, and regional offsets are discussed here. The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed in brief, but details are presented in Hughen et al. (this issue a). We do not make a recommendation for calibration beyond 26 cal kyr BP at this time; however, potential calibration data sets are compared in another paper (van der Plicht et al., this issue).
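Conceptually, a calibration curve maps a measured ¹⁴C age onto the calendar scale. A toy sketch of that lookup by linear interpolation is below; the curve points are invented for illustration and are not IntCal04 values, and a real calibration also propagates the measurement uncertainties that the random walk model mentioned above is designed to handle.

```python
import numpy as np

# (calendar age, 14C age), both in years BP -- hypothetical curve points,
# NOT IntCal04 values. Real curves are tabulated with uncertainties.
cal_bp = np.array([10000, 10500, 11000, 11500, 12000])
c14_bp = np.array([8800, 9200, 9600, 10100, 10400])

def calibrate(c14_age):
    """Map a 14C age onto the calendar scale by linear interpolation
    (curve assumed monotonic over this illustrative range)."""
    return float(np.interp(c14_age, c14_bp, cal_bp))

print(calibrate(9400))  # midway between 9200 and 9600 -> 10750.0
```

Real radiocarbon curves are not globally monotonic (plateaus and wiggles can make one ¹⁴C age correspond to several calendar ranges), which is one reason the statistical treatment in the paper matters.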
Abstract:
The objective of the study is to determine the psychometric properties of the Epistemological Beliefs Questionnaire on Mathematics. Participants were 171 secondary-school mathematics teachers from the Central Region of Cuba. The results show acceptable internal consistency. The factorial structure of the scale revealed three major factors, consistent with the Model of the Three Constructs: beliefs about knowledge, about learning, and about teaching. Irregular levels in the development of these teachers' epistemological belief system about mathematics were shown, with a tendency between the naivety and sophistication poles. In conclusion, the questionnaire is useful for evaluating teachers' beliefs about mathematics.
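Internal consistency of a questionnaire of this kind is conventionally summarised by Cronbach's alpha. A minimal sketch of the standard formula is below, run on made-up Likert responses; the data are illustrative, not the study's.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert responses: 6 respondents x 4 items
scores = [[4, 5, 4, 4],
          [3, 3, 3, 4],
          [5, 5, 4, 5],
          [2, 2, 3, 2],
          [4, 4, 4, 5],
          [3, 2, 3, 3]]
print(round(cronbach_alpha(scores), 2))  # → 0.93
```

Values around 0.7 or above are usually read as "acceptable" internal consistency, the threshold implied by the abstract's wording.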
Abstract:
The purpose of this article is to analyse the assessment procedures and instruments used by teachers of Geography and History in Compulsory Secondary Education (ESO) in the Region of Murcia (Spain). The data were collected using a survey technique and then subjected to a descriptive analysis. The results show that teachers generally hold a traditional conception of assessment, reflected in the view that assessment need not change when teaching strategies change or when they innovate. On the other hand, although they consider it necessary to employ a variety of instruments in order to assess well and to prevent school failure, they still use exams as the most objective and essential assessment instrument, and rather than applying continuous assessment they merely administer tests in a continuous way. The implementation of similar research in other areas or in other subjects shows the existence of contrasts in teachers' assessment practices.