Abstract:
Our aim was to investigate the migraine locus around the C19p13 region through analysis of the NOTCH3 gene (C19p13.2-p13.1), previously shown to be involved in CADASIL, and the TNFSF7 gene (C19p13), homologous to the ligands of TNF-alpha and TNF-beta; both genes have previously been associated with migraine. The NOTCH3 gene was analysed by sequencing all exons with known CADASIL mutations in a typical (non-familial hemiplegic) migraine family (MF1) that has previously been shown to be linked to C19p13. The TNFSF7 gene was investigated through SNP association analysis using a matched case-control migraine population. NOTCH3 sequencing results for affected members of MF1 proved negative for all sequence variants known to give rise to CADASIL. TNFSF7 chi-square results showed non-significant P values across all populations tested against controls, except for the MO (migraine without aura) subgroup, which displayed a possible association with the TNFSF7 SNP (genotype analysis P = 0.036, allele analysis P = 0.017). Our results suggest that common migraine is not caused by any known CADASIL mutation in the NOTCH3 gene. However, the TNFSF7 gene displayed signs of involvement in an MO-affected population, indicating that further independent studies of this marker are warranted.
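For readers unfamiliar with the analysis style, the sketch below shows a generic case-control chi-square test of a single SNP at the genotype and allele levels, using scipy; all counts are invented placeholders, not the study's data.

```python
# Hypothetical sketch of a case-control chi-square association test for a
# single SNP, of the kind described for the TNFSF7 marker above.
# Counts are invented placeholders, NOT the study's data.
from scipy.stats import chi2_contingency

# Genotype counts (rows: cases, controls; columns: AA, Aa, aa).
genotype_table = [
    [40, 90, 70],   # MO cases (hypothetical)
    [60, 95, 45],   # matched controls (hypothetical)
]
chi2, p_genotype, dof, _ = chi2_contingency(genotype_table)

def allele_counts(row):
    """Collapse genotype counts into allele counts (two alleles per person)."""
    n_AA, n_Aa, n_aa = row
    return [2 * n_AA + n_Aa, 2 * n_aa + n_Aa]  # counts of allele A, allele a

allele_table = [allele_counts(r) for r in genotype_table]
chi2_a, p_allele, _, _ = chi2_contingency(allele_table)

print(f"genotype P = {p_genotype:.3f}, allele P = {p_allele:.3f}")
```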
Abstract:
This practice-led project has two outcomes: a collection of short stories titled 'Corkscrew Section', and an exegesis. The short stories combine written narrative with visual elements such as images and typographic devices, while the exegesis analyses the function of these graphic devices within adult literary fiction. My creative writing explores a variety of genres and literary styles, but almost all of the stories are concerned with fusing verbal and visual modes of communication. The exegesis adopts the interpretive paradigm of multimodal stylistics, which aims to analyse graphic devices with the same level of detail as linguistic analysis. Within this framework, the exegesis compares and extends previous studies to develop a systematic method for analysing how the interactions between language, images and typography create meaning within multimodal literature.
Abstract:
I grew up in academic heaven. At least for me it was. Sweden in the late 1980s was paradise for any kind of empirical research: rich, high-quality business statistics were made available to researchers without their having to sign away their lives; 70+ percent response rates were achieved in mail surveys to almost any group (if you knew how to do them); and boards of directors opened their doors to more qualitatively oriented researchers to sit in during their meetings. In addition, I perceived an environment with a very high degree of academic freedom, letting me do whatever I found interesting and important. I’m sure for others it was sheer hell, with very unclear career paths and rules of the game. Career progression (something which rarely entered my mind) meant that you tried as best you could and then you put all your work – reports, books, book chapters, conference papers, maybe even published articles – in a box and had some external committee of professors look at it. If you were lucky they liked what they saw, for whatever reasons their professorial wisdom dictated, and you got hired or promoted. If you were not so lucky you wouldn’t get the job or the promotion, without quite knowing why. So people could easily imagine an old boys’ club – whose members were themselves largely unproven in international, peer-reviewed publishing – picking whoever they wanted by whatever criteria they chose to apply. Neither the fact that assessors were external nor the presence of an appeals system might have completely appeased your suspicious and skeptical mind, considering the balance of power.
Abstract:
PURPOSE Current research on errors in health care focuses almost exclusively on system and clinician error. It tends to exclude how patients may create errors that influence their health. We aimed to identify the types of errors that patients can contribute and help manage, especially in primary care. METHODS Eleven nominal group interviews of patients and primary health care professionals were held in Auckland, New Zealand, during late 2007. Group members reported and helped to classify types of potential error by patients. We synthesized the ideas that emerged from the nominal groups into a taxonomy of patient error. RESULTS Our taxonomy is a 3-level system encompassing 70 potential types of patient error. The first level classifies 8 categories of error into 2 main groups: action errors and mental errors. The action errors, which result in part or whole from patient behavior, are attendance errors, assertion errors, and adherence errors. The mental errors, which are errors in patient thought processes, comprise memory errors, mindfulness errors, misjudgments, and—more distally—knowledge deficits and attitudes not conducive to health. CONCLUSION The taxonomy is an early attempt to understand and recognize how patients may err and what clinicians should aim to influence so they can help patients act safely. This approach begins to balance perspectives on error but requires further research. There is a need to move beyond seeing patient, clinician, and system errors as separate categories of error. An important next step may be research that attempts to understand how patients, clinicians, and systems interact to cocreate and reduce errors.
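To make the shape of the three-level taxonomy concrete, the sketch below encodes its top two levels as a Python mapping; the third level (the 70 specific error types) is omitted, and the structure is inferred from the abstract rather than taken from the published instrument.

```python
# Top two levels of the patient-error taxonomy described above, as a nested
# mapping. The third level (70 specific error types) is omitted; this is an
# illustration inferred from the abstract, not the published instrument.
patient_error_taxonomy = {
    "action errors": [  # result in part or whole from patient behavior
        "attendance errors",
        "assertion errors",
        "adherence errors",
    ],
    "mental errors": [  # errors in patient thought processes
        "memory errors",
        "mindfulness errors",
        "misjudgments",
        "knowledge deficits",                  # more distal
        "attitudes not conducive to health",   # more distal
    ],
}
```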
Abstract:
One set of public institutions that has seen growing discussion about the transformative impact of new media technologies has been universities. The higher education sector, historically one of the more venerable and stable areas of public life, is now the subject of almost continuous speculation about whether it can continue in its current form during the 21st century. Digital media technologies are often seen as being at the forefront of such changes. It has been widely noted that moves towards a knowledge economy generate ‘skills-biased technological change’ that places a premium upon higher education qualifications, and that the resulting earnings gap remains despite the continuing increase in the number of university graduates. As the demand for higher education continues to grow worldwide, there are new discussions about whether technologically mediated education through new forms such as Massive Open Online Courses (MOOCs) is broadening access to quality learning, or severing the vital connection between teacher and student seen as integral to the learning process. This paper critically appraises such debates in the context of early 21st-century higher education. It will discuss ten drivers of change in higher education, many of which are related to themes discussed elsewhere in this book, such as the impact of social media, globalization, and a knowledge economy. It will also consider the issues raised in navigating such developments from the perspective of the ‘Five Ps’: practical issues; personal issues; pedagogical issues; policy issues; and philosophical issues. It also includes a critical evaluation of MOOCs from the point of view of their educational qualities. It will conclude with the observation that while universities will continue to play a significant – and perhaps growing – role in the economy, society and culture, the issues raised about what Clayton Christensen and Henry Eyring term the ‘disruptive university’ (Christensen and Eyring 2011) are nonetheless pressing ones, and that cost and policy pressures in particular are likely to generate significant institutional transformations in higher education worldwide.
Abstract:
Sweden’s protest against the Vietnam War was given tangible form in 1969 through the decision to give economic aid to the Government of North Vietnam. The main outcome was an integrated pulp and paper mill in the Vinh Phu Province north-west of Hanoi. Known as Bai Bang after its location, the mill became the most costly, one of the longest-lasting, and the most controversial project in the history of Swedish development cooperation. In 1996 Bai Bang produced at its full capacity. Today the mill is exclusively managed and staffed by the Vietnamese and there are plans for future expansion. At the same time, a substantial amount of money has been spent to reach these achievements. Looking back at the cumbersome history of the project, the results run contrary to many expectations. To learn more about the conditions for sustainable development, Sida commissioned two studies of the Bai Bang project. Together they touch upon several important issues in development cooperation over a period of almost 30 years: the change of aid paradigms over time, the role of foreign policy in development cooperation, cultural obstacles, recipient responsibility versus donor-led development, etc. The two studies were commissioned by Sida’s Department for Evaluation and Internal Audit, an independent department reporting directly to Sida’s Board of Directors. One study assesses the financial and economic viability of the pulp and paper mill and the broader development impact of the project in Vietnam. It was carried out by the Centre for International Economics, an Australian private economic research agency. The other study analyses the decision-making processes that created and shaped the project over a period of two decades, and reflects on lessons from the project for development cooperation in general. This study was carried out by the Chr. Michelsen Institute, a Norwegian independent research institution.
Abstract:
Nanowires (NWs) have attracted broad interest and wide application owing to their remarkable mechanical, optical, electrical, thermal and other properties. To unlock the revolutionary characteristics of NWs, a considerable body of experimental and theoretical work has been conducted. However, due to the extremely small dimensions of NWs, in situ experiments involve inherent complexities and huge challenges. For the same reason, the presence of defects appears as one of the most dominant factors in determining their properties. Hence, given the limitations of experiments and the necessity of investigating the influence of different defects, numerical simulation and modelling become increasingly important in characterizing the properties of NWs. It has been noted that, despite the number of numerical studies of NWs, significant work still lies ahead in terms of problem formulation, interpretation of results, identification and delineation of deformation mechanisms, and constitutive characterization of behaviour. Therefore, the primary aim of this study was to characterize both perfect and defected metal NWs. Large-scale molecular dynamics (MD) simulations were utilized to assess the mechanical properties and deformation mechanisms of different NWs under diverse loading conditions including tension, compression, bending, vibration and torsion. The target samples include different FCC metal NWs (e.g., Cu, Ag, Au NWs), which were either in a perfect crystal structure or constructed with different defects (e.g., pre-existing surface/internal defects, grain/twin boundaries). It was found from the tensile deformation that Young's modulus was insensitive to different styles of pre-existing defects, whereas the yield strength showed considerable reduction. The deformation mechanisms were found to be greatly influenced by the presence of defects, i.e., different defects acted as dislocation sources, and a rich variety of deformation mechanisms was triggered. Similar conclusions were obtained from the compressive deformation, i.e., Young's modulus was insensitive to different defects, but the critical stress showed evident reduction. Results from the bending deformation revealed that current modified beam models accounting for the surface effect, or for both the surface effect and the axial extension effect, still suffer certain inaccuracy, especially for NWs with ultra-small cross-sectional sizes. Additionally, the flexural rigidity of the NW was found to be insensitive to different pre-existing defects, while the yield strength showed an evident decrease. For the resonance study, the first-order natural frequency of the NW with pre-existing surface defects was almost the same as that of the perfect NW, whereas a lower first-order natural frequency and a significantly degraded quality factor were observed for NWs with grain boundaries. Most importantly, <110> FCC NWs were found to exhibit a novel beat phenomenon driven by a single actuation, which resulted from the asymmetry of the lattice spacing in the (110) plane of the NW cross-section and is expected to exert crucial impacts on in situ nanomechanical measurements. In particular, <110> Ag NWs with rhombic, truncated rhombic, and triangular cross-sections were found to naturally possess two first-mode natural frequencies, which are envisioned to enable NEMS applications that operate in a non-planar regime.
The torsion results revealed that the torsional rigidity of the NW was insensitive to the presence of pre-existing defects and twin boundaries, but was evidently reduced by grain boundaries. Meanwhile, the critical angle decreased considerably for defected NWs. This study has provided a comprehensive and deep investigation of the mechanical properties and deformation mechanisms of perfect and defected NWs, which will greatly extend and enhance the existing knowledge and understanding of the properties and performance of NWs, and eventually benefit the realization of their full potential applications. All MD models and theoretical analysis techniques established for the target NWs in this research are also applicable to future studies on other kinds of NWs. The results suggest that MD simulation is an effective and powerful tool, not only for characterizing the properties of NWs, but also for predicting novel or unexpected properties.
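The beat phenomenon and the paired first-mode frequencies reported above follow from elementary Euler-Bernoulli beam theory once the cross-section has two unequal principal second moments of area. The sketch below, with invented material and geometric values, shows how the two frequencies and the resulting beat can be estimated; it is an illustration of the underlying mechanics, not the MD models used in the study.

```python
import math

# First-mode natural frequencies of a cantilevered nanowire treated as an
# Euler-Bernoulli beam. An asymmetric cross-section gives two unequal
# principal second moments of area (I1 != I2), hence two close first-mode
# frequencies and a beat when both are excited by a single actuation.
# All numbers below are invented for illustration.

LAMBDA_1 = 1.8751  # first-mode eigenvalue for a cantilever beam

def natural_frequency(E, I, rho, A, L):
    """First-mode natural frequency (Hz) of a cantilever beam."""
    return (LAMBDA_1 ** 2 / (2.0 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))

E = 83e9        # Young's modulus of Ag, Pa (approximate bulk value)
rho = 10_490    # density of Ag, kg/m^3
L = 200e-9      # wire length, m (hypothetical)
a = 10e-9       # cross-section width, m (hypothetical)
b = 9e-9        # cross-section depth, m (hypothetical, slightly asymmetric)
A = a * b
I1 = a * b ** 3 / 12.0  # bending about one principal axis
I2 = b * a ** 3 / 12.0  # bending about the other principal axis

f1 = natural_frequency(E, I1, rho, A, L)
f2 = natural_frequency(E, I2, rho, A, L)
print(f"f1 = {f1/1e6:.1f} MHz, f2 = {f2/1e6:.1f} MHz, beat = {abs(f1-f2)/1e6:.1f} MHz")
```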
Abstract:
Research background For almost 80 years the Chuck Taylor (or Chuck T's) All Star basketball shoe has been an iconic item of fashion apparel. The Chuck T's were first designed in 1921 by Converse, an American shoe company, and over the decades they became popular not purely for sports and athletic purposes but evolved into the shoe of choice for many subcultural groups as a fashion item. In some circles the Chuck Taylor is still seen as the "coolest" sneaker of all time - one which will never go out of fashion regardless of changing trends. With over 600 million pairs sold all over the world since its release, the Converse shoe is representative not only of a fashion culture - but also of a consumption culture - that evolved as the driving force behind the massive growth of the Western economic system during the 20th century. Artisan Gallery (Brisbane), in conjunction with the exhibition Reboot: Function, Fashion and the Sneaker, a history of the sneaker, selected 20 designers to customise and re-design the classic Converse Chuck Taylor All Stars shoe, and in doing so highlighted the diversity of forms possible for creative outcomes. As Artisan Gallery Curator Kirsten Fitzpatrick states, "We were expecting people to draw and paint on them. Instead, we had shoes... mounted as trophies...", referring to the presentation of "Converse Consumption". The exhibition ran from 21 June to 16 August 2012. Research question The Chuck T's are one of many overwhelmingly commercially successful designs of the last century. Nowadays we are faced with the significant problems of overconsumption and the stress this causes on the natural ecosystem, and on people as a result. As an active member of the industrial design fraternity - a discipline that sits at the core of this problem - how can I use this opportunity to comment on the significant issue of consumption? An effective way to do this was to associate the consumption of goods with the consumption of sugar. There are significant similarities between our ceaseless desire to consume products and our fervent need to consume indulgent sweet foods. Artisan Statement Delicious, scrumptious, delectable... your pupils dilate, your blood pressure spikes, your liver goes into overdrive. Immediately, your brain cuts off the adenosine receptors, preventing drowsiness. Your body increases dopamine production, in turn stimulating the pleasure receptors in your brain. Your body absorbs all the sweetness and turns it into fat - while all the nutrients that you actually require are starting to be destroyed, about to be expelled. And this is only after one bite! After some time, though, your body comes crashing back to earth. You become irritable and begin to feel sluggish. Your eyelids seem heavy while your breathing pattern changes. Your body has consumed all the energy and destroyed all available nutrients. You literally begin to shut down. These are the physiological effects of sugar consumption. A perfect analogy for our modern-day consumer-driven world. Enjoy your dessert! Research contribution "Converse Consumption" contributes to the conversation regarding over-consumption by compelling people to reflect on their consumption behaviour through the reconceptualisation of the deconstructed Chuck T's in an attractive edible form. The viewer must reconcile the desire to consume the indulgent-looking dessert with the contradictory fact that it is composed of a pair of shoes.
The fact that the shoes are Chuck T's makes the effect even more powerful due to their iconic status. These clashing motivations are what make "Converse Consumption" a bizarre yet memorable experience. Significance The exhibition was viewed by in excess of 1,000 people and generated exceptional media coverage and public exposure/impact. As Artisan Gallery Curator Kirsten Fitzpatrick states, "20 of Brisbane's best designers were given the opportunity to customise their own Converse Sneakers, with The Converse Blank Canvas Project." Selection among this group demonstrates the calibre and prominence of the designers involved.
Abstract:
The life course of Australian researchers includes regular funding applications, which incur large personal and time costs. We previously estimated that Australian researchers spent 550 years preparing 3,727 proposals for the 2012 NHMRC Project Grant funding round, at an estimated annual salary cost of AU$66 million. Despite the worldwide importance of funding rounds, there is little evidence on what researchers think of the application process. We conducted a web-based survey of Australian researchers (May–July 2013) asking about their experience with NHMRC Project Grants. Almost all researchers (n=224 at 31 May) supported changes to the application (96%) and peer-review (88%) processes; 73% supported the introduction of shorter initial Expressions of Interest; and half (50%) provided extensive comments on the NHMRC processes. Researchers agreed preparing their proposals always took top priority over other work (97%) and personal (87%) commitments. More than half (57%) provided extensive comments on the ongoing personal impact of concurrent grant-writing and holiday seasons on family, children and other relationships. Researchers with experience on Grant Review Panels (34%) or as External Reviewers (78%) reported many sections of the proposals were rarely or never read, which suggests these sections could be cut with no impact on the quality of peer review. Our findings provide evidence on the experience of Australian researchers as applicants. The process of preparing, submitting and reviewing proposals could be streamlined to minimise the burden on applicants and peer reviewers, giving Australian researchers more time to work on actual research and be with their families.
Abstract:
Recent road safety statistics show that the decades-long downward trend in fatalities is stopping and stagnating. Statistics further show that crashes are mostly driven by human error, compared to other factors such as environmental conditions and mechanical defects. Within human error, the dominant error source is perceptive errors, which represent about 50% of the total. The next two sources are interpretation and evaluation, which, together with perception, account for more than 75% of human-error-related crashes. These statistics show that allowing drivers to perceive and understand their environment better, or supplementing them when they are clearly at fault, is a path to a good assessment of road risk and, as a consequence, to further decreasing fatalities. To address this problem, currently deployed driving assistance systems combine more and more information from diverse sources (sensors) to enhance the driver's perception of their environment. However, because of inherent limitations in range and field of view, these systems' perception of their environment remains largely limited to a small zone of interest around a single vehicle. Such limitations can be overcome by enlarging the zone of interest through a cooperative process. Cooperative Systems (CS), a specific subset of Intelligent Transportation Systems (ITS), aim at compensating for local systems' limitations by combining embedded information technology and intervehicular communication technology (IVC). With CS, information sources are no longer limited to a single vehicle. From this distribution arises the concept of extended or augmented perception. Augmented perception extends an actor's perceptive horizon beyond its "natural" limits by fusing information not only from multiple in-vehicle sensors but also from remote sensors. The end result of an augmented perception and data fusion chain is known as an augmented map. It is a repository where any relevant information about objects in the environment, and the environment itself, can be stored in a layered architecture. This thesis aims at demonstrating that augmented perception performs better than non-cooperative approaches and that it can be used to successfully identify road risk. We found it necessary to evaluate the performance of augmented perception in order to obtain better knowledge of its limitations. Indeed, while many promising results have already been obtained, the feasibility of building an augmented map from exchanged local perception information, and then using this information beneficially for road users, has not been thoroughly assessed yet; nor have the limitations of augmented perception and its underlying technologies. Most notably, many questions remain unanswered as to IVC performance and its ability to deliver the quality of service needed to support safety-critical, life-saving systems. This is especially true as the road environment is a complex, highly variable setting where many sources of imperfections and errors exist, not limited to IVC. We first provide a discussion of these limitations and a performance model built to incorporate them, created from empirical data collected on test tracks. Our results are more pessimistic than the existing literature, suggesting that IVC limitations have been underestimated. Then, we develop a new CS-applications simulation architecture.
This architecture is used to obtain new results on the safety benefits of a cooperative safety application (EEBL), and then to support further study of augmented perception. At first, we confirm earlier results in terms of reduced crash numbers, but raise doubts about benefits in terms of crash severity. In the next step, we implement an augmented perception architecture tasked with creating an augmented map. Our approach aims to provide a generalist architecture that can use many different types of sensors to create the map and is not limited to any specific application. The data association problem is tackled with a multiple hypothesis tracking (MHT) approach based on Belief Theory. Then, augmented and single-vehicle perception are compared in a reference driving scenario for risk assessment, taking into account the IVC limitations obtained earlier; we show their impact on the augmented map's performance. Our results show that augmented perception performs better than non-cooperative approaches, almost tripling the advance warning time before a crash. IVC limitations appear to have no significant effect on this performance, although this might hold only for our specific scenario. Eventually, we propose a new approach using augmented perception to identify road risk through a surrogate: near-miss events. A CS-based approach is designed and validated to detect near-miss events, and then compared to a non-cooperative approach based on vehicles equipped with local sensors only. The cooperative approach shows a significant improvement in the number of events that can be detected, especially at higher rates of system deployment.
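As a minimal illustration of the Belief Theory machinery underlying the MHT data-association step (a sketch of the standard Dempster's rule of combination, not the thesis's actual implementation), consider combining evidence from a local sensor and from a remote vehicle over IVC about which track an observation belongs to; all mass values are hypothetical.

```python
# Dempster's rule of combination over a small frame of discernment, as a
# minimal illustration of the Belief Theory machinery behind the MHT
# data-association step described above (not the thesis's implementation).
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dicts."""
    combined, conflict = {}, 0.0
    for (A, wa), (B, wb) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: the two sources fully contradict")
    return {A: w / (1.0 - conflict) for A, w in combined.items()}

# Frame: which track does a new observation belong to? (hypothetical)
T1, T2 = frozenset({"track1"}), frozenset({"track2"})
ALL = T1 | T2  # ignorance: could be either track
sensor_local = {T1: 0.6, ALL: 0.4}           # local sensor evidence (hypothetical)
sensor_remote = {T1: 0.3, T2: 0.5, ALL: 0.2} # evidence received over IVC (hypothetical)

for hyp, mass in dempster_combine(sensor_local, sensor_remote).items():
    print(sorted(hyp), round(mass, 3))
```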
Abstract:
As we race towards a new era, rapid change of conventional models has become the norm. Just as technology has etched itself into the core of society, the sheer quantity of student devices connecting to university networks presents a sector-wide challenge, coinciding almost perfectly with many universities creating technology-rich learning spaces. New fears include future-proofing: it is not just a matter of technology becoming outdated. In seeking to accommodate the teaching styles and experience of staff across diverse faculties, is this technology simply too vanilla to meet their needs as they become increasingly skilled and inspired by technology’s potential? Through the early findings of a study into staff use of technology within Queensland University of Technology's next-generation collaborative learning spaces, this paper explores whether the answers lie in a model in which students equip themselves with the tools they need to learn in the 21st century.
Abstract:
The use of mobile phones while driving is more prevalent among young drivers, a less experienced cohort with elevated crash risk. The objective of this study was to examine and better understand the reaction times of young drivers to a traffic event originating in their peripheral vision whilst engaged in a mobile phone conversation. The CARRS-Q Advanced Driving Simulator was used to test a sample of young drivers on various simulated driving tasks, including an event that originated within the driver's peripheral vision, whereby a pedestrian enters a zebra crossing from a sidewalk. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The participants were 21 to 26 years old and split evenly by gender. Drivers' reaction times to a pedestrian in the zebra crossing were modelled using a parametric accelerated failure time (AFT) duration model with a Weibull distribution. Also tested were two different model specifications to account for the structured heterogeneity arising from the repeated-measures experimental design. The Weibull AFT model with gamma heterogeneity was found to be the best-fitting model and identified four significant variables influencing reaction times: phone condition, driver's age, license type (provisional license holder or not), and self-reported frequency of handheld phone usage while driving. The reaction times of drivers were more than 40% longer in the distracted condition compared to baseline (not distracted). Moreover, the impairment of reaction times due to mobile phone conversations was almost double for provisional compared to open license holders. A reduction in the ability to detect traffic events in the periphery whilst distracted presents a significant and measurable safety concern that will undoubtedly persist unless mitigated.
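For readers unfamiliar with AFT duration models, the sketch below fits a basic Weibull AFT model with the lifelines Python library. All column names and data are hypothetical, and the gamma heterogeneity (frailty) term the study used for its repeated-measures design is not included in this minimal model.

```python
# Minimal Weibull AFT sketch using the lifelines library. Column names and
# data are hypothetical placeholders; the gamma heterogeneity (frailty) term
# the study used for repeated measures is NOT included in this basic model.
import pandas as pd
from lifelines import WeibullAFTFitter

df = pd.DataFrame({
    # reaction time to the pedestrian event, seconds (hypothetical)
    "reaction_time": [1.8, 2.6, 2.1, 3.4, 1.9, 2.9, 2.2, 3.1, 1.7, 2.8, 2.4, 3.6],
    "observed":      [1] * 12,                           # 1 = event observed
    "handheld":      [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1],  # phone condition
    "provisional":   [0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1],  # license type
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="reaction_time", event_col="observed")
aft.print_summary()

# In an AFT model, exp(coef) multiplies the event time, so a coefficient of
# ln(1.4) ~= 0.34 on a distraction term corresponds to the ~40% longer
# reaction times reported above.
```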
Abstract:
Background The Environments for Healthy Living (EFHL) study is a repeated-sample, longitudinal birth cohort in South East Queensland, Australia. We describe the sample characteristics and the profile of maternal, household, and antenatal exposures. Variation and data stability over recruitment years were examined. Methods For four months each year from 2006, pregnant women were recruited to EFHL at routine antenatal visits on or after 24 weeks gestation, from three public maternity hospitals. Participating mothers completed a baseline questionnaire on individual, familial, social and community exposure factors. Perinatal data were extracted from hospital birth records. Descriptive statistics and measures of association were calculated comparing the EFHL birth sample with regional and national reference populations. Data stability of antenatal exposure factors was assessed across five recruitment years (2006–2010 inclusive) using the Gamma statistic for ordinal data and chi-squared for nominal data. Results Across five recruitment years, 2,879 pregnant women were recruited, resulting in 2,904 live births including 29 sets of twins. EFHL has a lower representation of early-gestation babies, fewer stillbirths and a lower percentage of low-birth-weight babies when compared to regional data. The majority of women (65%) took a multivitamin supplement during pregnancy, 47% consumed alcohol, and 26% reported having smoked cigarettes. There were no differences in the rates of a range of antenatal exposures across the five years of recruitment, with the exception of increasing maternal pre-pregnancy weight (p=0.0349), decreasing rates of high maternal distress (p=0.0191) and decreasing alcohol consumption (p<0.0001). Conclusions The study sample is broadly representative of births in the region, and almost all factors showed data stability over time. This study, with repeated sampling of birth cohorts over multiple years, has the potential to make important contributions to population health through longitudinal follow-up and the evaluation of within-cohort temporal effects.
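A minimal sketch of the two kinds of stability tests described (chi-squared for a nominal exposure across recruitment years, and the Goodman-Kruskal Gamma statistic for an ordinal exposure); all counts are invented placeholders, not EFHL data.

```python
# Sketch of the data-stability checks described above: chi-squared for a
# nominal exposure across recruitment years, and Goodman-Kruskal gamma for
# an ordinal exposure. All counts are invented placeholders, not EFHL data.
from scipy.stats import chi2_contingency

# Rows: recruitment years 2006..2010; columns: smoked during pregnancy (yes, no).
smoking_by_year = [
    [150, 420], [145, 430], [140, 445], [150, 440], [148, 450],
]
chi2, p, dof, _ = chi2_contingency(smoking_by_year)
print(f"nominal exposure: chi2 = {chi2:.2f}, p = {p:.3f}")

def goodman_kruskal_gamma(table):
    """Gamma statistic for an ordinal-by-ordinal table (list of rows)."""
    concordant = discordant = 0
    n_rows, n_cols = len(table), len(table[0])
    for i in range(n_rows):
        for j in range(n_cols):
            for k in range(n_rows):
                for l in range(n_cols):
                    if k > i and l > j:       # both orderings agree
                        concordant += table[i][j] * table[k][l]
                    elif k > i and l < j:     # orderings disagree
                        discordant += table[i][j] * table[k][l]
    return (concordant - discordant) / (concordant + discordant)

# Rows: years (ordered); columns: maternal distress level (low, medium, high).
distress_by_year = [
    [300, 200, 80], [310, 205, 70], [325, 200, 60], [330, 210, 55], [340, 205, 50],
]
print(f"ordinal exposure: gamma = {goodman_kruskal_gamma(distress_by_year):.3f}")
```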
Abstract:
Layers (about 60-100 μm thick) of almost pure BaCuO2 (BC1), as determined using X-ray diffractometry (XRD) and scanning electron microscopy (SEM), coat the surfaces of YBa2Cu3O7-x (Y123) samples partial-melt processed in a single-zone vertical furnace. The actual Cu/Ba ratio of the BC1 phase is 1.2-1.3, as determined using energy dispersive X-ray spectrometry (EDS). The nominally BC1 phase displays an exsolution of BC1.5 or BC2 in the form of thin plates (about 50-100 nm thick) along {100}-type cleavage planes or facets. The exsolved phase also fills cracks within the BC1 layer, which requires it to have been in a molten state at some stage of processing. The samples were influenced by Pt contamination from the supporting wire, which may have stabilised the BC1.5 phase. Many of the Y123 grains have the same morphology as the exsolution domains and run nearly parallel to the thin plates of the exsolved phases, strongly indicating that Y123 nucleation took place at the interface between the BC1 phase and the exsolved BC1.5 or BC2 phases. The network of nearly parallel exsolved 'channels' provides a matrix and a mechanism through which a high degree of local texture can be initiated in the material.
Abstract:
In this paper we explore the relationship between monthly random breath testing (RBT) rates (per 1,000 licensed drivers) and alcohol-related traffic crash (ARTC) rates over time, across two Australian states: Queensland and Western Australia. We analyse the RBT, ARTC and licensed driver rates across 12 years; however, due to administrative restrictions, we model ARTC rates against RBT rates for the period July 2004 to June 2009. The Queensland data reveal that the monthly ARTC rate is almost flat over the five-year period. Based on the results of the analysis, an average of 5.5 ARTCs per 100,000 licensed drivers is observed across the study period. For the same period, the monthly rate of RBTs per 1,000 licensed drivers is observed to be decreasing across the study, with the analysis revealing no significant variations in the data. The comparison between Western Australia and Queensland shows that Queensland's ARTC monthly percent change (MPC) is 0.014, compared to an MPC of 0.47 for Western Australia. While Queensland maintains a relatively flat ARTC rate, the ARTC rate in Western Australia is increasing. Our analysis reveals an inverse relationship between ARTC and RBT rates: for every 10% increase in the percentage of RBTs to licensed drivers, there is a 0.15 decrease in the rate of ARTCs per 100,000 licensed drivers. Moreover, in Western Australia, if the 2011 ratio of 1:2 (RBTs to annual number of licensed drivers) were to double to a ratio of 1:1, we estimate the number of monthly ARTCs would be reduced by approximately 15. Based on these findings, we believe that as the number of RBTs conducted increases, the number of drivers willing to risk being detected for drink driving decreases, because the perceived risk of detection is greater. This in turn results in a diminishing number of ARTCs. The results of this study provide an important evidence base for policy decisions on RBT operations.
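The arithmetic behind the reported estimate can be made explicit with a short sketch. The linear relationship (0.15 fewer ARTCs per 100,000 drivers for each 10% rise in the RBT-to-driver percentage) is taken from the abstract, while the licensed-driver count below is a hypothetical round figure chosen only to show how an estimate of roughly 15 fewer monthly crashes can arise; it is not a number from the study.

```python
# Arithmetic sketch of the reported RBT/ARTC relationship: every 10% increase
# in the RBT-to-licensed-driver percentage lowers the monthly ARTC rate by
# 0.15 per 100,000 licensed drivers (from the abstract). The driver count is
# a hypothetical round number, NOT a figure from the study.
RATE_DROP_PER_10PCT = 0.15  # ARTCs per 100,000 drivers, per 10% RBT increase

def monthly_artc_reduction(pct_increase_in_rbt, licensed_drivers):
    """Estimated reduction in monthly ARTC count, assuming a linear effect."""
    rate_drop = (pct_increase_in_rbt / 10.0) * RATE_DROP_PER_10PCT
    return rate_drop * licensed_drivers / 100_000

# Doubling WA's 1:2 ratio to 1:1 is a 100% increase in RBTs per driver.
drivers = 1_000_000  # hypothetical licensed-driver count
print(monthly_artc_reduction(100, drivers))  # -> 15.0 fewer ARTCs per month
```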