148 results for scientific uncertainty
Abstract:
Disengagement of students in science and the scientific literacy of young adults are interrelated international concerns. One way to address these concerns is to engage students imaginatively in activities designed to improve their scientific literacy. Our ongoing program of research has focused on the effects of a sequence of activities that require students to transform scientific information on important issues for their communities from government websites into narrative text suitable for a lay reader. We call these hybridized stories BioStories. Students upload their stories for peer review to a dedicated website. Peer reviews are intended to help students refine their stories. Reviewing BioStories also gives students access to a wider range of scientific topics and writing styles. We have conducted separate studies with students from Grade 6, Grade 9 and Grade 12, involving case study and quasi-experimental designs. The results from the Grade 6 study support the argument that writing the sequence of stories helped the students become more familiar with the scientific issue, develop a deeper understanding of related biological concepts, and improve their interest in science. Unlike in the Grade 6 study, it was not possible to include a control group in the study conducted across eight Grade 9 classes. Nevertheless, the results suggest that hybridized writing developed more positive attitudes toward science and science learning, particularly in terms of the students’ interest and enjoyment. In the most recent case study, with Grade 12 students, we found that pride, strength, determination, interest and alertness were among the positive emotions most strongly elicited by the writing project. Furthermore, the students expressed enhanced feelings of self-efficacy in successfully writing hybridized scientific narratives in science.
In this chapter, we describe the pedagogy of hybridized writing in science, overview the evidence to support this approach, and identify future developments.
Abstract:
We develop a stochastic endogenous growth model to explain the diversity in growth and inequality patterns and the non-convergence of incomes in transitional economies where an underdeveloped financial sector imposes an implicit, fixed cost on the diversification of idiosyncratic risk. In the model endogenous growth occurs through physical and human capital deepening, with the latter being the more dominant element. We interpret the fixed cost as a ‘learning by doing’ cost for entrepreneurs who undertake risk in the absence of well developed financial markets and institutions that help diversify such risk. As such, this cost may be interpreted as the implicit returns foregone due to the lack of diversification opportunities that would otherwise have been available, had such institutions been present. The analytical and numerical results of the model suggest three growth outcomes depending on the productivity differences between the projects and the fixed cost associated with the more productive project. We label these outcomes as poverty trap, dual economy and balanced growth. Further analysis of these three outcomes highlights the existence of a diversity within diversity. Specifically, within the ‘poverty trap’ and ‘dual economy’ scenarios growth and inequality patterns differ, depending on the initial conditions. This additional diversity allows the model to capture a richer range of outcomes that are consistent with the empirical experience of several transitional economies.
Abstract:
Twitter is now well-established as an important platform for real-time public communication. Twitter research continues to lag behind these developments, with many studies remaining focused on individual case studies and utilizing home-grown, idiosyncratic, non-repeatable, and non-verifiable research methodologies. While the development of a full-blown “science of Twitter” may remain illusory, it is nonetheless necessary to move beyond such individual scholarship and toward the development of more comprehensive, transferable, and rigorous tools and methods for the study of Twitter on a large scale and in close to real time.
Abstract:
Five Canadian high school Chemistry classes in one school, taught by three different teachers, studied the concepts of dynamic chemical equilibria and Le Chatelier’s Principle. Some students received traditional teacher-led explanations of the concept first and used an interactive scientific visualisation second, while others worked with the visualisation first and received the teacher-led explanation second. Students completed a test of their conceptual understanding of the relevant concepts prior to instruction, after the first instructional session and at the end of instruction. Data on students’ academic achievement (highest, middle or lowest third of the class on the mid-term exam) and gender were also collected to explore the relationship between these factors, conceptual development and instructional sequencing. Results show, within this context at least, that teaching sequence is not important in terms of students’ conceptual learning gains.
Abstract:
Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in the prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling.
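The idea of treating the regression inputs as stochastic and propagating their uncertainty by Monte Carlo simulation, rather than reporting a single fixed estimate, can be sketched as follows. This is a minimal illustration only: the data, the power-law build-up form, and the residual-resampling scheme are assumptions for the example, not the models or data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observations: antecedent dry days vs. pollutant build-up.
# A power-law build-up form B = a * D**b is assumed purely for illustration.
dry_days = np.array([1, 2, 3, 5, 7, 10, 14], dtype=float)
buildup = 2.0 * dry_days**0.4 + rng.normal(0, 0.15, dry_days.size)

# Ordinary least squares in log space: log B = log a + b * log D.
X = np.column_stack([np.ones_like(dry_days), np.log(dry_days)])
y = np.log(buildup)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residual_sd = np.std(y - X @ coef, ddof=2)

# Monte Carlo step: perturb the prediction with the residual spread to
# obtain an uncertainty interval instead of a single fixed estimate.
n_sims = 5000
d_new = 8.0
log_preds = (coef[0] + coef[1] * np.log(d_new)
             + rng.normal(0, residual_sd, n_sims))
preds = np.exp(log_preds)
lo, hi = np.percentile(preds, [2.5, 97.5])
print(f"build-up at {d_new} dry days: "
      f"median {np.median(preds):.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
```

With more data, the same resampling loop could perturb the observations themselves (the weighted and Bayesian variants mentioned above), but the interval-reporting step would be unchanged.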
Abstract:
The dawn of the twenty-first century encouraged a number of scientific and technological organisations to identify what they saw as ‘Grand Challenges and Opportunities’. Issues of environment and health featured very prominently in these quite short lists, as can be seen from the sample of challenges in Table 1. Indeed, the first two lists of challenges in Table 1 were framed as challenges for the environment and for health, respectively.
Abstract:
This volume puts together the works of a group of distinguished scholars and active researchers in the field of media and communication studies to reflect upon the past, present, and future of new media research. The chapters examine the implications of new media technologies on everyday life, existing social institutions, and the society at large at various levels of analysis. Macro-level analyses of changing techno-social formations – such as discussions of the rise of surveillance society and the "fifth estate" – are combined with studies on concrete and specific new media phenomena, such as the rise of Pro-Am collaboration and "fan labor" online. In the process, prominent concepts in the field of new media studies, such as social capital, displacement, and convergence, are critically examined, while new theoretical perspectives are proposed and explicated. Reflecting the inter-disciplinary nature of the field of new media studies and communication research in general, the chapters interrogate these problems through a range of theoretical and methodological approaches. The book should offer students and researchers who are interested in the social impact of new media both critical reviews of the existing literature and inspirations for developing new research questions.
Abstract:
The ultimate goal of an access control system is to allocate each user the precise level of access they need to complete their job - no more and no less. This proves to be challenging in an organisational setting. On the one hand, employees need enough access to the organisation’s resources in order to perform their jobs; on the other hand, more access brings an increasing risk of misuse - either intentional, where an employee uses the access for personal benefit, or unintentional, through carelessness or being socially engineered into giving access to an adversary. This thesis investigates the shortcomings of existing approaches to access control in allocating the optimal level of access to users and proposes solutions in the form of new access control models. These issues are most evident when uncertainty surrounding users’ access needs, incentive to misuse and accountability are considered, hence the title of the thesis. We first analyse access control in environments where the administrator is unable to identify the users who may need access to resources. To resolve this uncertainty an administrative model with delegation support is proposed. Further, a detailed technical enforcement mechanism is introduced to ensure delegated resources cannot be misused. Then we explicitly consider that users are self-interested and capable of misusing resources if they choose to. We propose a novel game theoretic access control model to reason about and influence the factors that may affect users’ incentive to misuse. Next we study access control in environments where users’ access needs cannot be predicted, nor can users be held accountable for misuse. It is shown that by allocating budget to users, a virtual currency through which they can pay for the resources they deem necessary, the need for a precise pre-allocation of permissions can be relaxed. The budget also imposes an upper bound on users’ ability to misuse.
A generalised budget allocation function is proposed, and it is shown that, given the context information, the optimal level of budget for users can always be numerically determined. Finally, the Role Based Access Control (RBAC) model is analysed under the explicit assumption of administrators’ uncertainty about self-interested users’ access needs and their incentives to misuse. A novel Budget-oriented Role Based Access Control (B-RBAC) model is proposed. The new model introduces the notion of users’ behaviour into RBAC and provides means to influence users’ incentives. It is shown how RBAC policy can be used to individualise the cost of access to resources and also to determine users’ budgets. The implementation overheads of B-RBAC are examined and several low-cost sub-models are proposed.
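The budget idea described above - users pay a virtual currency for the resources they deem necessary, so precise pre-allocation is relaxed and total misuse is bounded by the budget - can be sketched in a few lines. This is a hypothetical illustration of the general mechanism; the class, names and prices are invented for the example and are not the thesis's B-RBAC model.

```python
class BudgetAccessControl:
    """Toy budget-based access control: access is granted only while the
    user can pay, so an exhausted budget caps further use (or misuse)."""

    def __init__(self):
        self.budgets = {}  # user -> remaining virtual currency
        self.prices = {}   # resource -> cost per access

    def grant_budget(self, user, amount):
        self.budgets[user] = self.budgets.get(user, 0) + amount

    def set_price(self, resource, cost):
        self.prices[resource] = cost

    def request(self, user, resource):
        cost = self.prices.get(resource)
        if cost is None or self.budgets.get(user, 0) < cost:
            return False            # unknown resource or insufficient budget
        self.budgets[user] -= cost  # the user pays for this access
        return True

ac = BudgetAccessControl()
ac.grant_budget("alice", 10)
ac.set_price("report.pdf", 4)
print(ac.request("alice", "report.pdf"))  # True  (budget 10 -> 6)
print(ac.request("alice", "report.pdf"))  # True  (budget 6 -> 2)
print(ac.request("alice", "report.pdf"))  # False (2 < 4, cannot pay)
```

The point of the design is visible in the last call: no permission list had to anticipate how many accesses the user would need, yet the budget still imposes a hard upper bound.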
Abstract:
The effects of small changes in flight-path parameters (primary and secondary flight paths, detector angles), and of displacement of the sample along the beam axis away from its ideal position, are examined for an inelastic time-of-flight (TOF) neutron spectrometer, emphasising the deep-inelastic regime. The aim was to develop a rational basis for deciding what measured shifts in the positions of spectral peaks could be regarded as reliable in the light of the uncertainties in the calibrated flight-path parameters. Uncertainty in the length of the primary or secondary flight path has the least effect on the positions of the peaks of H, D and He, which are dominated by the accuracy of the calibration of the detector angles. This aspect of the calibration of a TOF spectrometer therefore demands close attention to achieve reliable outcomes where the position of the peaks is of significant scientific interest and is discussed in detail. The corresponding sensitivities of the position of peak of the Compton profile, J(y), to flight-path parameters and sample position are also examined, focusing on the comparability across experiments of results for H, D and He. We show that positioning the sample to within a few mm of the ideal position is required to ensure good comparability between experiments if data from detectors at high forward angles are to be reliably interpreted.
Abstract:
This paper draws upon the current situation within Japanese Higher Education. In particular, the paper focuses on educational reforms and how they relate to the notions of Yutori Kyoiku, which constituted a major attempt by Japanese education to develop individual student capacity. A clear subtext of the recent neo-liberal reform agenda is a desire to incorporate free-market ideals into the Japanese educational system. This paper raises several important problems connected to the reforms, such as the decrease in classroom hours, changes to the contents of textbooks and a growing discrepancy in academic skills between students in different localities. These education reforms have impacted on notions of Yutori Kyoiku through the continuation of nationally standardized testing and changes directed at controlling the practices of classroom teachers. While acknowledging that the current Japanese cabinet’s (DP) education policy has been inherited from an earlier LDP government, the paper points to similarities between the current reforms and the iconic Meiji era reforms of the late 1800s.
Abstract:
We consider how data from scientific research should be used for decision making in health services. Whether a hand hygiene intervention to reduce risk of nosocomial infection should be widely adopted is the case study. Improving hand hygiene has been described as the most important measure to prevent nosocomial infection [1]. Transmission of microorganisms is reduced, and fewer infections arise, which leads to a reduction in mortality [2] and cost savings [3]. Implementing a hand hygiene program is itself costly, so the extra investment should be tested for cost-effectiveness [4,5]. The first part of our commentary is about cost-effectiveness models and how they inform decision making for health services. The second part is about how data on the effectiveness of hand hygiene programs arising from scientific studies are used, and two points are made: the threshold for statistical inference of 0.05 used to judge effectiveness studies is not important for decision making [6,7], and potentially valuable evidence about effectiveness might be excluded by decision makers because it is deemed low quality [8]. The ideas put forward will help researchers and health services decision makers to appraise scientific evidence in a more powerful way.
Abstract:
New venture growth is a central topic in entrepreneurship research. Although sales growth is emerging as the most commonly used measure of growth for emerging ventures, employment growth has also been used frequently. However, empirical research demonstrates that there are only very low to moderately sized correlations between the two (Delmar et al., 2003; Weinzimmer et al., 1998). In addition, sales growth and employment growth respond differently to a wide variety of criteria (Baum et al., 2001; Delmar et al., 2003). In this study we use transaction cost economics (Williamson, 1996) as a theoretical base to examine transaction cost influences on the addition of new employees as emerging ventures experience sales growth. We theorize that transaction cost economics variables will moderate the relationship between sales growth and employment growth. We develop and test hypotheses related to asset specificity, behavioral uncertainty, and the influence of resource munificence on the strength of the sales growth/employment growth relationship. Asset specificity is theorized to be a positive moderator of the relationship between sales growth and employment growth. When the behavioral uncertainty associated with adding new employees is greater than that of outsourcing or subcontracting, it is hypothesized to be a negative moderator of the sales growth/employment growth relationship. We also hypothesize that resource scarcity will strengthen those relationships.
Abstract:
Morris' (1986) analysis of the factors affecting project success and failure is considered in relation to the psychology of judgement under uncertainty. A model is proposed whereby project managers may identify the specific circumstances in which human decision-making is prone to systematic error, and hence may apply a number of de-biasing techniques.
Abstract:
Introduction. Calculating segmental (vertebral level-by-level) torso masses in Adolescent Idiopathic Scoliosis (AIS) patients allows the gravitational loading on the scoliotic spine during relaxed standing to be determined. This study used CT scans of AIS patients to measure segmental torso masses and explored how joint moments in the coronal plane are affected by changes in the position of the intervertebral joint’s axis of rotation, particularly at the apex of a scoliotic major curve. Methods. Existing low dose CT data from the Paediatric Spine Research Group was used to calculate vertebral level-by-level torso masses and joint torques occurring in the spine for a group of 20 female AIS patients (mean age 15.0 ± 2.7 years, mean Cobb angle 53 ± 7.1°). Image processing software, ImageJ (v1.45 NIH USA), was used to threshold the T1 to L5 CT images and calculate the segmental torso volume and mass corresponding to each vertebral level. Body segment masses for the head, neck and arms were taken from published anthropometric data. Intervertebral (IV) joint torques at each vertebral level were found using principles of static equilibrium together with the segmental body mass data. Summing the torque contributions for each level above the required joint allowed the cumulative joint torque at a particular level to be found. Since there is some uncertainty in the position of the coronal plane Instantaneous Axis of Rotation (IAR) for scoliosis patients, it was assumed the IAR was located in the centre of the IV disc. A sensitivity analysis was performed to see what effect the IAR had on the joint torques by moving it laterally 10mm in both directions. Results. The magnitude of the torso masses from T1-L5 increased inferiorly, with a 150% increase in mean segmental torso mass from 0.6kg at T1 to 1.5kg at L5.
The magnitudes of the calculated coronal plane joint torques during relaxed standing were typically 5-7 Nm at the apex of the curve, with the highest apex joint torque of 7 Nm being found in patient 13. Shifting the assumed IAR by 10mm towards the convexity of the spine increased the joint torque at that level by a mean 9.0%, showing that calculated joint torques were moderately sensitive to the assumed IAR location. When the IAR midline position was moved 10mm away from the convexity of the spine, the joint torque reduced by a mean 8.9%. Conclusion. Coronal plane joint torques as high as 7 Nm can occur during relaxed standing in scoliosis patients, which may help to explain the mechanics of AIS progression. This study provides new anthropometric reference data on vertebral level-by-level torso mass in AIS patients which will be useful for biomechanical models of scoliosis progression and treatment. However, the CT scans were performed in supine (no gravitational load on spine) and curve magnitudes are known to be smaller than those measured in standing.
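The static-equilibrium calculation described above - each segment above a joint contributes mass x gravity x lateral lever arm from the assumed IAR, and shifting the IAR laterally changes every lever arm - can be sketched as follows. The segment masses and centroid offsets below are hypothetical placeholders, not the patient data from the study.

```python
G = 9.81  # gravitational acceleration, m/s^2

def joint_torque(masses_kg, lateral_offsets_m, iar_offset_m=0.0):
    """Coronal-plane torque (N m) at a joint from all segments above it.

    lateral_offsets_m: lateral positions of each segment centroid relative
    to the midline; iar_offset_m: assumed lateral position of the
    Instantaneous Axis of Rotation, which the sensitivity analysis in the
    study shifted by +/-10 mm.
    """
    return sum(m * G * (x - iar_offset_m)
               for m, x in zip(masses_kg, lateral_offsets_m))

# Hypothetical segments, listed from the top level downward.
masses = [0.6, 0.8, 1.0, 1.2]        # segment masses (kg)
offsets = [0.01, 0.03, 0.05, 0.04]   # lateral centroid offsets (m)

base = joint_torque(masses, offsets)
shifted = joint_torque(masses, offsets, iar_offset_m=0.010)  # IAR moved 10 mm
print(f"torque: {base:.3f} N m; with shifted IAR: {shifted:.3f} N m")
```

Moving the assumed IAR towards the load (here by 10 mm) shortens every lever arm and so lowers the computed torque, which is the sensitivity the study quantified at roughly 9% per 10 mm.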