585 results for Deterministic Trend.


Relevance: 10.00%

Publisher:

Abstract:

This study investigates a way to systematically integrate information literacy (IL) into an undergraduate academic programme and develops a model for integrating information literacy across higher education curricula. Curricular integration of information literacy in this study means weaving information literacy into an academic curriculum. In the associated literature, it is also referred to as the information literacy embedding approach or the intra-curricular approach. The key findings identified from this study are presented in four categories: the characteristics of IL integration; the key stakeholders in IL integration; IL curricular design strategies; and the process of IL curricular integration. Three key characteristics of the curricular integration of IL are identified: collaboration and negotiation; contextualisation; and ongoing interaction with information. The key stakeholders in the curricular integration of IL are recognised as the librarians, the course coordinators and lecturers, the heads of faculties or departments, and the students. Strategies for IL curricular design include the use of IL policies and standards, the combination of face-to-face and online teaching as an emerging trend, and the use of IL assessment tools, which play an important role in IL integration. IL can be integrated into the intended curriculum (what an institution expects its students to learn), the offered curriculum (what the teachers teach) and the received curriculum (what students actually learn). IL integration is a process of negotiation, collaboration and the implementation of the intended curriculum. IL can be integrated at different curriculum levels: institutional, faculty, departmental, course and class. Based on these key findings, an IL curricular integration model is developed.
The model integrates curriculum, pedagogy and learning theories, IL theories, IL guidelines and the collaboration of multiple partners. The model provides a practical approach to integrating IL into multiple courses across an academic degree. The development of the model was based on the IL integration experiences of various disciplines in three universities and the implementation experience of an engineering programme at a fourth university; thus it may be of interest to other disciplines. The model has the potential to enhance IL teaching and learning and curricular development, and to support the implementation of graduate attributes in higher education. Sociocultural theories are applied to the research process and IL curricular design of this study. Sociocultural theories describe learning as being embedded within social events and occurring as learners interact with other people, objects, and events in a collaborative environment. They are applied to explore how academic staff and librarians experience the curricular integration of IL; they also support collaboration in the curricular integration of IL and the development of an IL integration model. This study consists of two phases. Phase I (2007) was the interview phase, in which both academic staff and librarians at three IL-active universities were interviewed. During this phase, attention was paid specifically to the practical process of curricular integration of IL and IL activity design. Phase II, the development phase (2007-2008), was conducted at a fourth university and explored the systematic integration of IL into an engineering degree from Year 1 to Year 4. Learning theories such as sociocultural theories, Bloom’s Taxonomy and IL theories were used in IL curricular development. Based on the findings from both phases, an IL integration model was developed. The findings and the model contribute to IL education, research and curricular development in higher education.
The sociocultural approach adopted in this study also extends the application of sociocultural theories to the IL integration process and curricular design in higher education.

Relevance: 10.00%

Publisher:

Abstract:

This thesis traces the influence of postmodernism on picturebooks. Through a review of current scholarship on both postmodernism and postmodern literature it examines the multiple ways in which picturebooks have responded to the influence of postmodernism. The thesis is predominantly located in the field of Cultural and Literary Studies, which informs the ways in which children’s literature is positioned within contemporary culture and how it responds to the influences which shape its production and reception. Cultural and Literary Studies also offers a useful theoretical frame for analysing issues of textuality, ideology, and originality, as well as social and political comment in the focus texts. The thesis utilises the theoretical contributions of, in particular, Linda Hutcheon, Brian McHale, and Fredric Jameson, as well as work in children’s literature studies. This thesis makes a significant contribution to the development of an understanding of the place of the postmodern picturebook within the cultural context of postmodernism. It adds to the field of children’s literature research through an awareness of the (continuing) evolution of the postmodern picturebook, particularly as the current scholarship on the postmodern picturebook does not engage with the changing form and significance of the postmodern picturebook to the same extent as this thesis. This study is significant from a methodological perspective as it draws on a wide range of theoretical perspectives across literary studies, visual semiotics, philosophy, cultural studies, and history to develop a tripartite methodological framework that utilises the methods of postclassical narratology, semiotics, and metafictive strategies to carry out the textual analysis of the focus texts. The three analysis chapters examine twenty-two picturebooks in detail with respect to the ways in which the conventions of narrative are subverted and how dominant discourses are interrogated.
Chapter 4: Subverting Narrative Processes includes analysis of narrative point of view, modes of representation, and characters and the problems of identity and subjectivity. Chapter 5: Challenging Truth, History, and Unity examines questions of ontology, the difficulties of representing history, and addresses issues of difference and ‘otherness’. The final textual analysis chapter, Chapter 6: Engaging with Postmodernity, critiques texts which engage with issues of globalisation, mass media, and consumerism. Brief discussion of a further fifteen picturebooks throughout the thesis provides additional support. Children’s texts have a tradition of being both resistant and compliant. Their resistance has made a space for the development of the postmodern picturebook; their compliance is evident in a tendency to take a route around a truly radical or iconoclastic position. This thesis posits that children’s postmodern picturebooks adopt what suits their form and purposes by drawing from and reflecting on some influences of postmodernism while disregarding those that seem irrelevant to their direction. Furthermore, the thesis identifies a shift in the focus of a number of postmodern picturebooks produced since the turn of the twenty-first century. This trend has seen a shift from texts which interrogate discourses of liberal humanism to those that engage with aspects of postmodernity. These texts, postmodernesque picturebooks, offer contradictory perspectives on aspects of society emanating from the rise in global trends mentioned above.

Relevance: 10.00%

Publisher:

Abstract:

Anna Hirsch and Clare Dixon (2008, 190) state that creative writers’ ‘obsession with storytelling…might serve as an interdisciplinary tool for evaluating oral histories.’ This paper enters a dialogue with Hirsch and Dixon’s statement by documenting an interview methodology for a practice-led PhD project, The Artful Life Story: Oral History and Fiction, which investigates the fictionalising of oral history.

Alistair Thomson (2007, 62) notes the interdisciplinary nature of oral history scholarship from the 1980s onwards. As a result, oral histories are being used and understood in a variety of arts-based settings. In such contexts, oral histories are not valued so much for their factual content but as sources that are at once dynamic, emotionally authentic and open to a multiplicity of interpretations. How can creative writers design and conduct interviews that reflect this emphasis?

The paper briefly maps the growing trend of using oral histories in fiction and ethnographic novels, in order to establish the need to design interviews for arts-based contexts. I describe how I initially designed the interviews to suit the aims of my practice. Once in the field, however, I found that my original methods did not account for my experiences. I conclude with the resulting reflection and understanding that emerged from these problematic encounters, focusing on the technique of the steered monologue (Scagliola 2010), sometimes referred to as the Biographic Narrative Interpretative Method (Wengraf 2001; Jones 2006).

Relevance: 10.00%

Publisher:

Abstract:

The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut down on costs related to, for example, inventory and production, as well as to facilitate synergies.
Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The Information Technology (IT) revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting and Replenishment, owes much to the marketing efforts of the software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).

Relevance: 10.00%

Publisher:

Abstract:

INTRODUCTION. Following anterior thoracoscopic instrumentation and fusion for the treatment of thoracic AIS, implant-related complications have been reported at rates as high as 20.8%. Currently the magnitudes of the forces applied to the spine during anterior scoliosis surgery are unknown. The aim of this study was to measure the segmental compressive forces applied during anterior single rod instrumentation in a series of adolescent idiopathic scoliosis patients. METHODS. A force transducer was designed, constructed and retrofitted to a surgical cable compression tool routinely used to apply segmental compression during anterior scoliosis correction. Transducer output was continuously logged during the compression of each spinal joint, and the output at completion was converted to an applied compression force using calibration data. The angle between adjacent vertebral body screws was also measured on intra-operative frontal plane fluoroscope images taken both before and after each joint compression. The difference in angle between the two images was calculated as an estimate of the achieved correction at each spinal joint. RESULTS. Force measurements were obtained for 15 scoliosis patients (aged 11-19 years) with single thoracic curves (Cobb angles 47°-67°). In total, 95 spinal joints were instrumented. The average force applied at a single joint was 540 N (±229 N), ranging between 88 N and 1018 N. Experimental error in the force measurement, determined from transducer calibration, was ±43 N. A trend for higher forces at joints close to the apex of the scoliosis was observed. The average joint correction angle measured by fluoroscope imaging was 4.8° (±2.6°, range 0°-12.6°). CONCLUSION. This study has quantified, in vivo, the intra-operative correction forces applied by the surgeon during anterior single rod instrumentation. These data provide a useful contribution towards an improved understanding of the biomechanics of scoliosis correction. In particular, these data will be used as input for developing patient-specific finite element simulations of scoliosis correction surgery.
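The conversion from logged transducer output to applied compression force can be sketched as a simple linear calibration. Everything below — the calibration pairs, the millivolt units and the reading being converted — is hypothetical and for illustration only, not the study's data:

```python
import numpy as np

# Hypothetical calibration data: transducer output (mV) logged while known
# loads (N) were applied on a bench rig. These numbers are illustrative only.
cal_output_mv = np.array([0.0, 5.1, 10.3, 15.2, 20.4])
cal_force_n = np.array([0.0, 250.0, 500.0, 750.0, 1000.0])

# First-order (linear) calibration: force = a * output + b
a, b = np.polyfit(cal_output_mv, cal_force_n, 1)

def output_to_force(output_mv):
    """Convert a logged transducer reading into an applied force in newtons."""
    return a * output_mv + b

# Reading logged at completion of one joint compression (hypothetical)
force_at_completion = output_to_force(11.0)
```

A linear fit is the simplest choice; a higher polynomial order could capture transducer non-linearity if the calibration data warranted it.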

Relevance: 10.00%

Publisher:

Abstract:

There is a worldwide trend towards rapidly growing defined contribution pension funds, in terms of assets and membership, and towards greater choice for individuals. This has shifted the decision-making responsibility for managing the investment of retirement savings to fund members. This change has given rise to a phenomenon where most superannuation fund members are responsible for either actively choosing or passively relying on their funds’ default investment options. Prior research identifies deficiencies in financial literacy as one of the causes of inertia in financial decision-making, and findings from international and Australian studies show that financial illiteracy is widespread. Given the potentially significant economic and social consequences of poor financial decision-making in superannuation matters, this paper proposes a framework for exploring the various demographic, social and contextual factors that influence fund members’ financial literacy, and its association with investment choice decisions. Enhanced theoretical and empirical understanding of the factors associated with active/passive investment choice decisions would enable the development of well-targeted financial education programs.

Relevance: 10.00%

Publisher:

Abstract:

Asylum is being gradually denuded of the national institutional mechanisms (judicial, legislative and administrative) that provide the framework for a fair and effective asylum hearing. In this sense, there is an ongoing ‘denationalization’ or ‘deformalization’ of the asylum process. This chapter critically examines one of the linchpins of this trend: the erection of pre-entry measures at ports of embarkation in order to prevent asylum seekers from physically accessing the territory of the state. Pre-entry measures comprise the core requirement that foreigners possess an entry visa granting permission to enter the state of destination. Visa requirements are increasingly implemented by immigration officials posted abroad or by officials of transit countries pursuant to bilateral agreements (so-called ‘juxtaposed’ immigration controls). Private carriers, which are subject to sanctions if they bring persons to a country who do not have permission to enter, also engage in a form of de facto immigration control on behalf of states. These measures constitute a type of ‘externalized’ or ‘exported’ border that pushes the immigration boundaries of the state as far from its physical boundaries as possible. Pre-entry measures have a crippling impact on the ability of asylum seekers to access the territory of states to claim asylum. In effect, states have ‘externalized’ asylum by replacing the legal obligation on states to protect refugees arriving at ports of entry with what are perceived to be no more than moral obligations towards asylum seekers arriving at the external border of the state.

Relevance: 10.00%

Publisher:

Abstract:

Background In order to provide insights into the complex biochemical processes inside a cell, modelling approaches must find a balance between achieving an adequate representation of the physical phenomena and keeping the associated computational cost within reasonable limits. This issue is particularly acute when spatial inhomogeneities have a significant effect on the system's behaviour. In such cases, a spatially-resolved stochastic method can better portray the biological reality, but the corresponding computer simulations can in turn be prohibitively expensive. Results We present a method that incorporates spatial information by means of tailored, probability-distributed time delays. These distributions can be obtained directly from a single in silico experiment or a suitable set of in vitro experiments, and are subsequently fed into a delay stochastic simulation algorithm (DSSA), achieving a good compromise between computational cost and a much more accurate representation of spatial processes such as molecular diffusion and translocation between cell compartments. Additionally, we present a novel alternative approach based on delay differential equations (DDE) that can be used in scenarios of high molecular concentrations and low noise propagation. Conclusions Our proposed methodologies accurately capture and incorporate certain spatial processes into temporal stochastic and deterministic simulations, increasing their accuracy at low computational cost. This is of particular importance given that the time spans of cellular processes are generally larger (possibly by several orders of magnitude) than those achievable by current spatially-resolved stochastic simulators. Hence, our methodology allows users to explore cellular scenarios under the effects of diffusion and stochasticity over time spans that were, until now, simply unfeasible. Our methodologies are supported by theoretical considerations on the different modelling regimes, i.e. spatial vs. delay-temporal, as indicated by the corresponding Master Equations and presented elsewhere.
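A minimal sketch of the delay stochastic simulation idea, assuming a single conversion A → B whose product appears only after a probability-distributed delay (standing in for diffusion or translocation between compartments). The rate constant, gamma delay distribution and initial copy number below are illustrative assumptions, not values from the paper:

```python
import heapq
import random

def delay_ssa(a0=50, k=0.1, t_end=40.0, seed=1,
              delay_sampler=lambda: random.gammavariate(4.0, 0.5)):
    """Minimal delay SSA for A -> (delay) -> B.

    Each firing consumes one A immediately; the corresponding B is
    released only after a delay drawn from `delay_sampler`.
    """
    random.seed(seed)
    t, A, B = 0.0, a0, 0
    pending = []  # min-heap of release times for delayed B molecules
    history = [(t, A, B)]
    while t < t_end:
        prop = k * A
        t_next = t + (random.expovariate(prop) if prop > 0 else float("inf"))
        if pending and pending[0] <= min(t_next, t_end):
            # A delayed product is due before the next reaction: release it.
            t = heapq.heappop(pending)
            B += 1
        elif t_next <= t_end:
            # Fire the reaction now; schedule the delayed appearance of B.
            t = t_next
            A -= 1
            heapq.heappush(pending, t + delay_sampler())
        else:
            break
        history.append((t, A, B))
    return history

hist = delay_ssa()
```

Re-drawing the exponential waiting time after each released product is statistically valid here because the propensity depends only on A and the exponential distribution is memoryless.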

Relevance: 10.00%

Publisher:

Abstract:

Background Up to one-third of people affected by cancer experience ongoing psychological distress and would benefit from screening followed by an appropriate level of psychological intervention. This rarely occurs in routine clinical practice due to barriers such as lack of time and experience. This study investigated the feasibility of community-based telephone helpline operators screening callers affected by cancer for their level of distress using a brief screening tool (the Distress Thermometer), and triaging them to the appropriate level of care using a tiered model. Methods Consecutive cancer patients and carers who contacted the helpline from September to December 2006 (n = 341) were invited to participate. Routine screening and triage were conducted by helpline operators at this time. Additional socio-demographic and psychosocial adjustment data were collected by telephone interview by research staff following the initial call. Results The Distress Thermometer had good overall accuracy in detecting general psychosocial morbidity (Hospital Anxiety and Depression Scale cut-off score ≥ 15) for cancer patients (AUC = 0.73) and carers (AUC = 0.70). We found that 73% of participants met the Distress Thermometer cut-off for distress caseness according to the Hospital Anxiety and Depression Scale (a score ≥ 4), and optimal sensitivity (83%, 77%) and specificity (51%, 48%) were obtained with cut-offs of ≥ 4 and ≥ 6 in the patient and carer groups respectively. Distress was significantly associated with the Hospital Anxiety and Depression Scale scores (total, as well as anxiety and depression subscales) and level of care in cancer patients, as well as with the Hospital Anxiety and Depression Scale anxiety subscale for carers. There was a trend for more highly distressed callers to be triaged to more intensive care, with patients with distress scores ≥ 4 more likely to receive extended or specialist care.
Conclusions Our data suggest that it was feasible for community-based cancer helpline operators to screen callers for distress using a brief screening tool, the Distress Thermometer, and to triage callers to an appropriate level of care using a tiered model. The Distress Thermometer is a rapid and non-invasive alternative to longer psychometric instruments, and may provide part of the solution in ensuring distressed patients and carers affected by cancer are identified and supported appropriately.
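The sensitivity and specificity figures above come from evaluating Distress Thermometer cut-offs against HADS-defined caseness; the computation at a single cut-off can be sketched as follows. The scores and caseness labels below are invented for illustration, not study data:

```python
import numpy as np

def sens_spec(scores, is_case, cutoff):
    """Sensitivity and specificity of a screening rule `score >= cutoff`."""
    scores = np.asarray(scores)
    is_case = np.asarray(is_case, dtype=bool)
    flagged = scores >= cutoff
    sensitivity = (flagged & is_case).sum() / is_case.sum()
    specificity = (~flagged & ~is_case).sum() / (~is_case).sum()
    return sensitivity, specificity

# Hypothetical Distress Thermometer scores (0-10) and HADS-based caseness
dt_scores = [2, 3, 4, 5, 6, 7, 8, 1, 9, 4]
hads_case = [0, 0, 1, 1, 1, 1, 1, 0, 1, 0]
sens, spec = sens_spec(dt_scores, hads_case, cutoff=4)
```

Sweeping `cutoff` over the score range and trading sensitivity against specificity is what an ROC/AUC analysis such as the one reported above formalises.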

Relevance: 10.00%

Publisher:

Abstract:

Introduction The ability to screen the blood of early stage operable breast cancer patients for circulating tumour cells is of potential importance for identifying patients at risk of developing distant relapse. We present the results of a study of the efficacy of the immunobead RT-PCR method in identifying patients with circulating tumour cells. Results Immunomagnetic enrichment of circulating tumour cells followed by RT-PCR (immunobead RT-PCR) with a panel of five epithelial-specific markers (ELF3, EPHB4, EGFR, MGB1 and TACSTD1) was used to screen for circulating tumour cells in the peripheral blood of 56 breast cancer patients. Twenty patients were positive for two or more RT-PCR markers, including seven patients who were node negative by conventional techniques. Significant increases in the frequency of marker positivity were seen in lymph node positive patients, in patients with high grade tumours and in patients with lymphovascular invasion. A strong trend towards improved disease-free survival was seen for marker-negative patients, although it did not reach significance (p = 0.08). Conclusion Multi-marker immunobead RT-PCR analysis of peripheral blood is a robust assay that is capable of detecting circulating tumour cells in early stage breast cancer patients.
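The classification rule reported above (a patient counts as positive when two or more of the five markers are RT-PCR positive) can be sketched directly; the per-patient calls below are hypothetical:

```python
# The five epithelial-specific markers from the panel.
MARKERS = ["ELF3", "EPHB4", "EGFR", "MGB1", "TACSTD1"]

def ctc_positive(calls, threshold=2):
    """Classify a patient as circulating-tumour-cell positive when at least
    `threshold` markers are RT-PCR positive (1 = positive, 0 = negative)."""
    return sum(calls.get(m, 0) for m in MARKERS) >= threshold

# Hypothetical patients: two markers positive vs. only one
patient_a = {"ELF3": 1, "EPHB4": 0, "EGFR": 1, "MGB1": 0, "TACSTD1": 0}
patient_b = {"ELF3": 1, "EPHB4": 0, "EGFR": 0, "MGB1": 0, "TACSTD1": 0}
```

Requiring two concordant markers rather than one is a common way to reduce false positives from spurious amplification of a single transcript.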

Relevance: 10.00%

Publisher:

Abstract:

In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
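A minimal sketch of the Historical Simulation approach and the exceedance count used above to assess VaR accuracy. The 250-day rolling window, 99% level and simulated Gaussian returns are assumptions for illustration, not the paper's setup:

```python
import numpy as np

def historical_var(returns, window=250, alpha=0.01):
    """One-day Historical Simulation VaR series.

    For each day t, VaR is the alpha-quantile of the previous `window`
    daily returns, reported as a positive loss number.
    """
    returns = np.asarray(returns, dtype=float)
    var = np.full(returns.shape, np.nan)
    for t in range(window, len(returns)):
        var[t] = -np.quantile(returns[t - window:t], alpha)
    return var

def count_exceedances(returns, var):
    """Days on which the realised loss exceeded the reported VaR."""
    mask = ~np.isnan(var)
    return int(np.sum(-np.asarray(returns)[mask] > var[mask]))

# Simulated daily returns standing in for trading revenues
rng = np.random.default_rng(0)
rets = rng.normal(0.0, 0.01, size=750)
var99 = historical_var(rets, window=250, alpha=0.01)
n_exc = count_exceedances(rets, var99)
```

Comparing the realised exceedance rate against the nominal 1% is the basic backtest; a Historical Simulation VaR reacts only slowly to volatility changes, which is consistent with the paper's finding that it carries little information about future volatility.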

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND - High-density lipoprotein (HDL) protects against arterial atherothrombosis, but it is unknown whether it protects against recurrent venous thromboembolism. METHODS AND RESULTS - We studied 772 patients after a first spontaneous venous thromboembolism (average follow-up 48 months) and recorded the end point of symptomatic recurrent venous thromboembolism, which developed in 100 of the 772 patients. The relationship between plasma lipoprotein parameters and recurrence was evaluated. Plasma apolipoproteins AI and B were measured by immunoassays for all subjects. Compared with those without recurrence, patients with recurrence had lower mean (±SD) levels of apolipoprotein AI (1.12±0.22 versus 1.23±0.27 mg/mL, P<0.001) but similar apolipoprotein B levels. The relative risk of recurrence was 0.87 (95% CI, 0.80 to 0.94) for each increase of 0.1 mg/mL in plasma apolipoprotein AI. Compared with patients with apolipoprotein AI levels in the lowest tertile (<1.07 mg/mL), the relative risk of recurrence was 0.46 (95% CI, 0.27 to 0.77) for the highest-tertile patients (apolipoprotein AI >1.30 mg/mL) and 0.78 (95% CI, 0.50 to 1.22) for midtertile patients (apolipoprotein AI of 1.07 to 1.30 mg/mL). Using nuclear magnetic resonance, we determined the levels of 10 major lipoprotein subclasses and HDL cholesterol for 71 patients with recurrence and 142 matched patients without recurrence. We found a strong trend for association between recurrence and low levels of HDL particles and HDL cholesterol. CONCLUSIONS - Patients with high levels of apolipoprotein AI and HDL have a decreased risk of recurrent venous thromboembolism. © 2007 American Heart Association, Inc.

Relevance: 10.00%

Publisher:

Abstract:

From their very outset, the disciplines of social science have claimed a need for interdisciplinarity. Proponents of new disciplines have also claimed the whole of human activity as their domain, whilst simultaneously emphasising the need for increased specialisation. Critical social analysis attempts to repair the flaws of specialisation. In this chapter, I argue that the trend towards academic specialisation in social science is most usefully viewed from the perspective of evaluative meaning, and that each new discipline, in emphasising one aspect of a broken conception of humanity, necessarily emphasises one aspect of an already broken conception of value. Critical discourse analysis, qua critical social analysis, may therefore benefit by firstly proceeding from the perspective of evaluative meaning to understand the dynamics of social change and overcome the challenges posed by centuries of intensive specialisation in social science.

Relevance: 10.00%

Publisher:

Abstract:

As societal awareness of sustainability gains momentum worldwide, the higher education sector is expected to take the lead in education, research and the promotion of sustainable development. Universities have the diversity of skills and knowledge to explore new concepts and issues, the academic freedom to offer unbiased observations, and the capacity to engage in experimentation for solutions. There is a global trend of universities recognising and responding to the sustainability challenge. By adopting green technologies, buildings on university campuses have the potential to offer highly productive and green environments for a quality learning experience for students, while minimising environmental impacts. Despite the potential benefits and the metaphorical link to sustainability, few universities have moved towards implementing Green Roofs and Living Walls widely on campuses, even though these have had more successful applications in commercial and residential buildings. Few past research efforts have examined the fundamental barriers to the implementation of sustainable projects on campuses at the organizational level. To address this deficiency, an ongoing research project is being undertaken by Queensland University of Technology in Australia. The research aims to develop a comprehensive framework to facilitate better decision making for the promotion of Green Roof and Living Wall applications on campuses. It will explore and highlight organizational factors as well as investigate and emphasise project delivery issues. The critical technical indicators for Green Roof and Living Wall implementation will also be identified. The expected outcome of this research has the potential to enhance Green Roof and Living Wall delivery in Australian universities, as a vital step towards realising sustainability in the higher education sector.

Relevance: 10.00%

Publisher:

Abstract:

The flood flow in urbanised areas constitutes a major hazard to the population and infrastructure, as seen during the summer 2010-2011 floods in Queensland (Australia). Flood flows in urban environments have been studied only relatively recently, and no previous study has considered the impact of turbulence in the flow. During the 12-13 January 2011 flood of the Brisbane River, turbulence measurements were conducted in an inundated urban environment in Gardens Point Road, next to Brisbane's central business district (CBD), at relatively high frequency (50 Hz). The properties of the sediment flood deposits were characterised, and the acoustic Doppler velocimeter unit was calibrated to obtain both instantaneous velocity components and suspended sediment concentration in the same sampling volume with the same temporal resolution. While the flow motion in Gardens Point Road was subcritical, the water elevations and velocities fluctuated with a distinctive period between 50 and 80 s. The low-frequency fluctuations were linked with some local topographic effects: i.e., a local choke induced by an upstream constriction between stairwells caused slow oscillations with a period close to the natural sloshing period of the car park. The instantaneous velocity data were analysed using a triple decomposition, and the same triple decomposition was applied to the water depth, velocity flux, suspended sediment concentration and suspended sediment flux data. The velocity fluctuation data showed a large energy component in the slow fluctuation range. For the first two tests at z = 0.35 m, the turbulence data suggested some isotropy. At z = 0.083 m, on the other hand, the findings indicated some flow anisotropy. The suspended sediment concentration (SSC) data presented a general trend of increasing SSC with decreasing water depth. During one test (T4), long-period oscillations were observed with a period of about 18 minutes.
The cause of these oscillations remains unknown to the authors. The last test (T5) took place in very shallow water with high suspended sediment concentrations. It is suggested that the flow in the car park was disconnected from the main channel. Overall the flow conditions at the sampling sites corresponded to a specific momentum between 0.2 and 0.4 m², which would be near the upper end of the scale for safe evacuation of individuals in flooded areas. However, the authors do not believe that the evacuation of individuals in Gardens Point Road would have been safe, because of the intense water surges and flow turbulence. More generally, any criterion for safe evacuation based solely upon the flow velocity, water depth or specific momentum cannot account for the hazards caused by flow turbulence, water depth fluctuations and water surges.
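The triple decomposition applied to the velocity, depth and sediment records splits each signal into a mean, a slow (low-frequency) component and a turbulent residual. The sketch below uses a moving-average low-pass filter with an assumed cutoff period; this is one simple filtering choice, not necessarily the authors' method, and the synthetic signal is illustrative only:

```python
import numpy as np

def triple_decompose(u, dt, cutoff_period):
    """Triple decomposition u(t) = U + u_slow(t) + u'(t).

    U is the record mean; u_slow is a moving-average low-pass of the
    demeaned signal with a window of about `cutoff_period`, capturing the
    slow topography-driven fluctuations; u' is the turbulent residual.
    """
    u = np.asarray(u, dtype=float)
    U = u.mean()
    win = max(1, int(round(cutoff_period / dt)))
    kernel = np.ones(win) / win
    u_slow = np.convolve(u - U, kernel, mode="same")
    u_turb = u - U - u_slow
    return U, u_slow, u_turb

# Synthetic example: a slow "sloshing" mode (50 s period) plus random
# turbulence, sampled at 50 Hz as in the Gardens Point Road measurements.
dt = 0.02
t = np.arange(0.0, 200.0, dt)
rng = np.random.default_rng(1)
u = 0.8 + 0.2 * np.sin(2 * np.pi * t / 50.0) + 0.05 * rng.standard_normal(t.size)
U, u_slow, u_turb = triple_decompose(u, dt, cutoff_period=10.0)
```

By construction the three parts sum back to the original signal exactly; the choice of cutoff period between the sloshing period (50-80 s) and the turbulent time scales is what separates the slow fluctuations from the turbulence proper.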