451 results for ADMINISTERED MORPHINE
Abstract:
This study examined the impact of physical, psychological, and sexual violence on the social adjustment of Grade 8 and 9 school children in the state of Tripura, India. The study participants, 160 boys and 160 girls, were randomly selected from classes in eight English- and Bengali-medium schools in Agartala city, Tripura. Data were collected using a self-administered Semi-structured Questionnaire for Children/Students and a Social Adjustment Inventory, both custom-made for the study based on measures in the extant research adapted for the Indian context. Findings revealed that students experienced physical (21.9%), psychological (20.9%), and sexual (18.1%) violence at home, and 29.7% of the children had witnessed family violence. Boys were more often victims of physical and psychological violence, while girls were more often victims of sexual violence. The social adjustment scores of school children who experienced violence, regardless of the nature of the violence, were significantly lower than the scores of those who had not experienced violence (p<0.001). Social adjustment was poorer for girls than boys (p<0.001). The study speaks in favour of early detection and intervention for all child maltreatment subtypes and for children exposed to interparental violence, and highlights the crucial role of schools and school psychology in addressing the problem.
Abstract:
Objective To examine the extent to which the odds of birth, pregnancy, or adverse birth outcomes are higher among women aged 28 to 36 years who use fertility treatment compared with untreated women. Design Prospective, population-based. Setting Not applicable. Patient(s) Participants in the Australian Longitudinal Study on Women's Health (ALSWH) born in 1973 to 1978 who reported on their infertility and use of in vitro fertilization (IVF) or ovulation induction (OI). Intervention(s) Postal survey questionnaires administered as part of the ALSWH. Main Outcome Measure(s) Odds of birth outcomes among women treated with IVF or OI and untreated women, estimated using adjusted logistic regression modeling. Result(s) Among 7,280 women, 18.6% (n = 1,376) reported infertility. Half (53.0%) of the treated women gave birth compared with 43.8% of untreated women. Women with prior parity were less likely to use IVF than nulliparous women. Women using IVF or OI, respectively, were more likely to have given birth after treatment or to be pregnant compared with untreated women. Women using IVF or OI were as likely to have ectopic pregnancies, stillbirths, or premature or low birthweight babies as untreated women. Conclusion(s) More than 40% of women aged 28–36 years reporting a history of infertility can achieve births without using treatment, indicating they are subfertile rather than infertile.
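The adjusted logistic regression mentioned above can be sketched briefly. This is a minimal illustration only: the column names ("birth", "treatment", "parity", "age") are hypothetical, and the actual ALSWH covariate set is not given in the abstract.

```python
# Minimal sketch of an adjusted logistic regression for birth odds by
# fertility treatment group. Column names are hypothetical assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def birth_odds_ratios(df: pd.DataFrame) -> pd.Series:
    """Adjusted odds ratios for birth (0/1) by treatment group."""
    fit = smf.logit(
        "birth ~ C(treatment, Treatment('untreated')) + parity + age",
        data=df,
    ).fit(disp=False)
    return np.exp(fit.params)  # exponentiated coefficients = odds ratios
```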
Abstract:
This paper presents a summary of the key findings of the TTF TPACK Survey developed and administered for the Teaching the Teachers for the Future (TTF) Project implemented in 2011. The TTF Project, funded by an Australian Government ICT Innovation Fund grant, involved all 39 Australian higher education institutions that provide initial teacher education. TTF data collections were undertaken at the end of Semester 1 (T1) and at the end of Semester 2 (T2) in 2011. A total of 12,881 participants completed the first survey (T1) and 5,809 participants completed the second survey (T2). Groups of like-named items from the T1 survey were subjected to a battery of complementary data analysis techniques. The psychometric properties of the four scales (Confidence - teacher items; Usefulness - teacher items; Confidence - student items; Usefulness - student items) were confirmed at both T1 and T2. Among the key findings summarised, at the national level the scale 'Confidence to use ICT as a teacher' showed measurable growth across the whole scale from T1 to T2, as did the scale 'Confidence to facilitate student use of ICT'. Additional key TTF TPACK Survey findings are summarised.
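The T1-to-T2 growth comparison can be illustrated schematically. Note this is a simplified mean-score comparison on simulated Likert responses; the TTF analysis itself was psychometric (scale-level measurement), and the item counts here are invented.

```python
# Schematic only: compare mean "Confidence to use ICT as a teacher" scale
# scores at T1 and T2 with Welch's t-test on simulated 5-point responses.
# The real TTF analysis used scale-level psychometric methods, not this.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
t1 = rng.integers(1, 6, size=(12881, 8)).astype(float).mean(axis=1)  # T1 cohort
t2 = rng.integers(1, 6, size=(5809, 8)).astype(float).mean(axis=1)   # T2 cohort

t, p = stats.ttest_ind(t2, t1, equal_var=False)  # unpaired: cohorts differ
print(f"T2 - T1 growth: {t2.mean() - t1.mean():.3f} (t = {t:.2f}, p = {p:.3f})")
```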
Abstract:
Purpose: Myopia is a common eye disorder affecting up to 90% of children in South East Asia and 30% of the population worldwide. Myopia of high severity is a leading cause of blindness around the world (4th to 5th most common). Changes and remodelling of the sclera, i.e. increased cellular proliferation and protein synthesis within scleral cells (↑ scleral DNA) and thinning and loss of the scleral extracellular matrix (↓ scleral GAG synthesis), have been linked to myopic eye growth in animal models. Signals acting on the sclera are thought to originate in the retina and are modulated by the retinal pigment epithelium (RPE), with limited evidence suggesting that the RPE can modify scleral cell growth in culture. However, the mechanism of retinal signal transmission and the role of posterior eye cup tissue, including the RPE, in mediating changes in scleral fibroblast growth during myopia development are unclear. Retinal transmitter systems are critically involved in pathways regulating eye growth, which ultimately lead to alterations in the sclera if eye size is to change. A dopaminergic agonist and muscarinic antagonists decrease the proliferation of scleral chondrocytes when co-cultured with chick RPE. GABA receptors have recently been localised to chick sclera. We therefore hypothesised that posterior eye cup tissue from myopic eyes would stimulate, and from hyperopic eyes would inhibit, growth of scleral fibroblasts in vitro, and that GABAergic agents could directly interact with scleral cells or indirectly modify the effects of myopic and hyperopic posterior eye cup tissue on scleral fibroblast growth. Method: Fibroblastic cells obtained from 8-day-old chick sclera were used to establish cell banks. Two major experiments were performed. Experiment 1: To determine whether posterior eye cup tissue from myopic eyes stimulates, and from hyperopic eyes inhibits, scleral cell proliferation when co-cultured with scleral cells in vitro. This study comprised two linked experiments: i) monocular visual treatments of FDM (form-deprivation myopia), LIM (lens-induced myopia) and LIH (lens-induced hyperopia), with assessment of the effect of full-punch eye cup tissue on DNA and GAG synthesis by cultured chick scleral fibroblasts; and ii) binocular visual treatments comprising LIM and LIH, with assessment of the effect of individual layers of eye cup tissue (neural retina, RPE and choroid) on cultured chick scleral fibroblasts. Visual treatment was applied for 3 days. Experiment 2: To determine the direct effect of GABAergic agents on scleral cell growth and to establish whether GABAergic agents modify the stimulatory/inhibitory effect of myopic and hyperopic posterior eye cup tissues on cultured scleral cell growth in vitro. Two linked experiments were performed: i) GABA agonists (muscimol and baclofen) and GABA antagonists (bicuculline, CGP46381 and TPMPA) were added to scleral cell culture medium to determine their direct effect on scleral cells; ii) GABAergic agents (agonists and antagonists) were administered to scleral fibroblasts co-cultured with posterior eye cup tissue (retina, RPE, retina/RPE, RPE/choroid). Ocular tissues were obtained from chick eyes wearing +15D (LIH) or -15D (LIM) lenses for 3 days.
In both experiments, tissues were added to hanging cell culture inserts (pore size 1.0 µm) placed over each well of 24-well plates, while scleral cells were cultured in DMEM/F12 GlutaMAX (Gibco) plus 10% FBS, penicillin/streptomycin (50 U/ml) and fungizone (1.25 µg/ml) (Gibco), at a seeding density of 30,000 cells/well at the bottom of the well, and allowed to grow for 3 days. Scleral cell proliferation throughout the study was evaluated by determining the GAG and DNA content of scleral cells using dimethylmethylene blue (DMMB) dye and Quant-iT™ PicoGreen® dsDNA reagent, respectively. Results and analysis: Based on DNA and GAG content, there was no significant difference in the tissue effect of LIM and LIH eyes on scleral fibroblast growth (DNA: 8.4 ± 1.1 μg versus 9.3 ± 2.3 μg, p=0.23; GAG: 10.13 ± 1.4 μg versus 12.67 ± 1.2 μg, F2,23=6.16, p=0.0005) when tissues were obtained from monocularly treated chick eyes (FDM, +15D lens or -15D lens over the right eye with the left eye untreated) and co-cultured as full punches. When chick eyes were treated binocularly, with a -15D lens (LIM) over the right eye and a +15D lens (LIH) over the left eye, and tissue layers were separated, the retina from LIM eyes did not stimulate scleral cell proliferation compared to LIH eyes (DNA: 27.2 ± 6.7 μg versus 23.2 ± 1.5 μg, p=0.23; GAG: 28.1 ± 3.7 μg versus 28.7 ± 4.2 μg, p=0.21). Similarly, the LIH and LIM choroid did not produce a differential effect based on DNA (LIM 46.9 ± 6.4 μg versus LIH 53.5 ± 4.7 μg, p=0.18); however, the choroid from LIH eyes induced higher scleral GAG content than that from LIM eyes (32.5 ± 6.7 μg versus 18.9 ± 1.2 μg, p=0.023). In contrast, the RPE from LIM eyes caused a significant increase in fibroblast proliferation whereas the RPE from LIH eyes was relatively inhibitory (72.4 ± 6.3 μg versus 27.9 ± 2.3 μg, F1,6=69.99, p=0.0005). GAG data were opposite to DNA data, e.g. the RPE from LIH eyes increased (33.7 ± 7.9 μg) while the RPE from LIM eyes decreased (28.2 ± 3.0 μg) scleral cell growth (F1,6=13.99, p=0.010). Based on DNA content, GABA agents had a small direct effect on scleral cell growth: GABA agonists increased proliferation (21.4 ± 1.0% and 18.3 ± 1.0% with muscimol and baclofen, p=0.0021), whereas GABA antagonists decreased fibroblast proliferation (-23.7 ± 0.9% with bicuculline and CGP46381 and -28.1 ± 0.5% with TPMPA, p=0.0004). GABA agents also modified the effect of LIM and LIH tissues (p=0.0005). The increase in proliferation of scleral fibroblasts co-cultured with tissues (RPE, retina, RPE/retina and RPE/choroid) from LIM-treated eyes was enhanced by GABA agonists (muscimol: 27.4 ± 1.2%, 35.8 ± 1.6%, 8.4 ± 0.3% and 11.9 ± 0.6%; baclofen: 27.0 ± 1.0%, 15.8 ± 1.5%, 16.8 ± 1.2% and 15.4 ± 0.4%, p=0.014), whereas GABA antagonists further reduced scleral fibroblast growth (bicuculline: -52.5 ± 2.5%, -36.9 ± 1.4%, -37.5 ± 0.6% and -53.7 ± 0.9%; TPMPA: -57.3 ± 1.3%, -15.7 ± 1.2%, -33.5 ± 0.4% and -45.9 ± 1.5%; CGP46381: -51.9 ± 1.6%, -28.5 ± 1.5%, -25.4 ± 2.0% and -45.5 ± 1.9%, respectively, p=0.0034). GAG data were opposite to DNA data throughout the experiment, e.g. GABA agonists further inhibited while antagonists relatively enhanced scleral fibroblast growth for both LIM and LIH tissue co-culture. The effect of GABA agents was relatively smaller (p=0.0004) for tissue from LIH versus LIM eyes but was in a similar direction. There was a significant drug effect on all four tissue types, i.e. RPE, retina, RPE/retina and RPE/choroid, for both LIM and LIH tissue co-culture (F20,92=3.928, p=0.0005).
However, the effect of GABA agents was greatest in co-culture with RPE tissue (F18,36=4.865, p=0.0005). Summary and Conclusion: 1) Retinal defocus signals are transferred to the RPE and choroid, which then exert their modifying effect on scleral GAG and DNA synthesis, either through growth-stimulating factors or by directly interacting with scleral cells, in the process of scleral remodelling during LIM and LIH visual conditions. 2) GABAergic agents affect the proliferation of scleral fibroblasts both directly and when co-cultured with ocular tissues in vitro.
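The tissue-layer comparisons above rest on F-tests of DNA and GAG content across co-culture conditions. A minimal sketch of such a comparison, using invented placeholder values rather than the study's data:

```python
# One-way ANOVA comparing scleral-cell DNA content (ug) across co-cultured
# tissue conditions. Replicate values are invented placeholders.
import numpy as np
from scipy import stats

dna_ug = {
    "RPE_LIM":    np.array([72.1, 70.8, 74.5, 72.2]),
    "RPE_LIH":    np.array([27.5, 28.4, 26.9, 28.8]),
    "retina_LIM": np.array([26.8, 27.9, 27.0, 27.1]),
}
f, p = stats.f_oneway(*dna_ug.values())
print(f"F = {f:.2f}, p = {p:.4g}")
```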
Abstract:
Background: Effective self-management of diabetes is essential for the reduction of diabetes-related complications, as global rates of diabetes escalate. Methods: Randomised controlled trial. Adults with type 2 diabetes (n = 120), with HbA1c greater than or equal to 7.5 %, were randomly allocated (4 × 4 block randomised block design) to receive an automated, interactive telephone-delivered management intervention or usual routine care. Baseline sociodemographic, behavioural and medical history data were collected by self-administered questionnaires and biological data were obtained during hospital appointments. Health-related quality of life (HRQL) was measured using the SF-36. Results: The mean age of participants was 57.4 (SD 8.3), 63 % of whom were male. There were no differences in demographic, socioeconomic and behavioural variables between the study arms at baseline. Over the six-month period from baseline, participants receiving the Australian TLC (Telephone-Linked Care) Diabetes program showed a 0.8 % decrease in geometric mean HbA1c from 8.7 % to 7.9 %, compared with a 0.2 % HbA1c reduction (8.9 % to 8.7 %) in the usual care arm (p = 0.002). There was also a significant improvement in mental HRQL, with a mean increase of 1.9 in the intervention arm, while the usual care arm decreased by 0.8 (p = 0.007). No significant improvements in physical HRQL were observed. Conclusions: These analyses indicate the efficacy of the Australian TLC Diabetes program with clinically significant post-intervention improvements in both glycaemic control and mental HRQL. These observed improvements, if supported and maintained by an ongoing program such as this, could significantly reduce diabetes-related complications in the longer term. Given the accessibility and feasibility of this kind of program, it has strong potential for providing effective, ongoing support to many individuals with diabetes in the future.
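The HbA1c change is reported on the geometric-mean scale (8.7% to 7.9%, a 0.8-point drop). A small sketch of that computation, on invented values chosen to give approximately the same drop:

```python
# Geometric mean of HbA1c values: exp(mean(log(x))). Arrays are invented
# to illustrate an ~0.8-point drop in geometric mean; not trial data.
import numpy as np

def geometric_mean(x: np.ndarray) -> float:
    return float(np.exp(np.log(x).mean()))

baseline  = np.array([8.1, 9.4, 8.9, 8.3, 8.8])
follow_up = np.array([7.4, 8.6, 8.0, 7.6, 8.1])
print(f"change: {geometric_mean(baseline) - geometric_mean(follow_up):.2f} points")
```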
Abstract:
Rationale Although the advent of atypical, second-generation antipsychotics (SGAs) has reduced the likelihood of akathisia, this adverse effect remains a problem. Extrapyramidal adverse effects are associated with increased drug occupancy of dopamine 2 receptors (DRD2). The A1 allele of DRD2/ANKK1 rs1800497 is associated with decreased striatal DRD2 density. Objectives The aim of this study was to identify whether the A1(T) allele of DRD2/ANKK1 was associated with akathisia (measured with the Barnes Akathisia Rating Scale) in a clinical sample of 234 patients treated with antipsychotics. Results Definite akathisia (a score ≥ 2 on the global clinical assessment of akathisia) was significantly less common in subjects prescribed SGAs (16.8%) than in those prescribed first-generation antipsychotics (FGAs) (47.6%), p<0.0001. Overall, 24.1% of A1+ (A1A2/A1A1) patients treated with SGAs had akathisia compared to 10.8% of A1- (A2A2) patients. A1+ patients administered SGAs also had higher global clinical assessment of akathisia scores than A1- subjects (p=0.01). SGAs maintained their advantage over FGAs regarding akathisia even among A1+ patients. Conclusions These results strongly suggest that A1+ variants of the DRD2/ANKK1 Taq1A allele confer risk for akathisia in patients treated with SGAs and may explain inconsistencies across prior studies comparing FGAs and SGAs.
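The genotype comparison (24.1% of A1+ vs 10.8% of A1- patients on SGAs with akathisia) is the kind of 2x2 association a chi-square test checks. The cell counts below are hypothetical, chosen only to reproduce the reported percentages:

```python
# Chi-square test on a hypothetical 2x2 table of akathisia by DRD2/ANKK1
# genotype among SGA-treated patients. Counts are illustrative: 20/83 =
# 24.1% and 13/120 = 10.8% match the percentages reported above.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[20, 63],    # A1+ : akathisia, no akathisia
                  [13, 107]])  # A1- : akathisia, no akathisia
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```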
Abstract:
This investigation examined the physiological and performance effects of cooling on the recovery of medium-fast bowlers in the heat. Eight medium-fast bowlers completed two randomised trials, each involving two sessions completed on consecutive days (Session 1: 10 overs; Session 2: 4 overs) in 31 ± 3°C and 55 ± 17% relative humidity. Recovery interventions were administered for 20 min (mixed-method cooling vs. control) after Session 1. Measures included bowling performance (ball speed, accuracy, run-up speeds), physical demands (global positioning system, counter-movement jump), physiological (heart rate, core temperature, skin temperature, sweat loss), biochemical (creatine kinase, C-reactive protein) and perceptual variables (perceived exertion, thermal sensation, muscle soreness). Mean ball speed was higher after cooling in Session 2 (118.9 ± 8.1 vs. 115.5 ± 8.6 km·h−1; P = 0.001; d = 0.67), reducing the decline in ball speed between sessions (0.24 vs. −3.18 km·h−1; P = 0.03; d = 1.80). Large effects indicated higher accuracy in Session 2 after cooling (46.0 ± 11.2 vs. 39.4 ± 8.6 arbitrary units [AU]; P = 0.13; d = 0.93) without affecting total run-up speed (19.0 ± 3.1 vs. 19.0 ± 2.5 km·h−1; P = 0.97; d = 0.01). Cooling reduced core temperature, skin temperature and thermal sensation throughout the intervention (P = 0.001–0.05; d = 1.31–5.78) and attenuated creatine kinase (P = 0.04; d = 0.56) and muscle soreness at 24 h (P = 0.03; d = 2.05). Accordingly, mixed-method cooling can reduce thermal strain after a 10-over spell and improve markers of muscular damage and discomfort alongside maintained medium-fast bowling performance on consecutive days in hot conditions.
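The d values quoted throughout are Cohen's d effect sizes. A minimal sketch of the pooled-SD formulation, with placeholder arrays standing in for the cooling and control trial data:

```python
# Cohen's d with pooled standard deviation, as commonly reported for
# between-condition comparisons (e.g. d = 0.67 for ball speed above).
# The input arrays below are invented placeholders, not study data.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardised mean difference using the pooled SD."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return float((a.mean() - b.mean()) / pooled_sd)

cooling = np.array([118.9, 119.4, 117.8, 120.1, 118.2, 119.0, 118.5, 119.3])
control = np.array([115.5, 116.2, 114.8, 116.0, 115.1, 115.9, 115.3, 115.2])
print(f"d = {cohens_d(cooling, control):.2f}")
```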
Abstract:
Recent advancements in the capabilities of information and communication technologies (ICT) offer unique avenues to support the delivery of nutrition care. Despite ICT being widely available, evidence on the practices and attitudes regarding ICT use among dietitians is limited. A cross-sectional survey of Dietitians Association of Australia members was administered online in August 2011. All dietitians who responded (n=87) had access to a computer at work. Half reported providing non-face-to-face consultations, with the telephone and email the most common modes of delivery. Smartphone use was reported by 49% of practitioners, with 30% recommending nutrition-related applications and/or programs to clients. The most commonly reported benefits of technology use in practice included improvements in access to information/resources, time management, and workflow efficiency. Barriers identified related to the cost of and access to technology, and a lack of suitable programs/applications. Technology was viewed as an important tool in practice by 93% of dietitians surveyed; however, only 38% were satisfied with their current level of use. The majority (81%) believed more technology should be integrated within dietetics, while 85% indicated that the development of suitable and practical applications and programs is necessary for future practice. Technology is regarded as an important tool by Australian dietitians, with an expressed need for its further inclusion to facilitate nutrition care. Regular and ongoing evaluation of technology use among dietitians is vital to ensure that applications and use are evidence based and relevant to consumers in the digital world.
Abstract:
Critically ill patients receiving extracorporeal membrane oxygenation (ECMO) are often noted to have increased sedation requirements; however, data on sedation in this complex group of patients are limited. The aim of our study was to characterise the sedation requirements of adult patients receiving ECMO for cardiorespiratory failure. A retrospective chart review was performed to collect sedation data for 30 consecutive patients who received venovenous or venoarterial ECMO between April 2009 and March 2011. To test for a difference in doses over time we used a regression model. The dose of midazolam received on ECMO support increased by an average of 18 mg per day (95% confidence interval 8, 29 mg, P=0.001), while the dose of morphine increased by 29 mg per day (95% confidence interval 4, 53 mg, P=0.021). The venovenous group received a daily midazolam dose that was 157 mg higher than the venoarterial group (95% confidence interval 53, 261 mg, P=0.005). We did not observe any significant increase in fentanyl doses over time (95% confidence interval −1269, 4337 µg, P=0.94). There was a significant increase in dose requirements for morphine and midazolam during ECMO. Patients on venovenous ECMO received higher sedative doses compared with patients on venoarterial ECMO. Future research should focus on the mechanisms behind these changes and identify the drugs most suitable for sedation during ECMO.
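The dose-over-time regression described above can be sketched as follows. The data frame layout and column names are assumptions, and the study may well have used a repeated-measures model rather than this pooled ordinary least squares fit:

```python
# Minimal sketch: daily midazolam dose regressed on ECMO day; the slope
# gives the average mg/day change with its 95% CI. Column names
# ("midazolam_mg", "ecmo_day") are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def daily_dose_trend(df: pd.DataFrame) -> tuple[float, tuple[float, float]]:
    """Slope (mg/day) and 95% CI for dose versus day on ECMO."""
    fit = smf.ols("midazolam_mg ~ ecmo_day", data=df).fit()
    lo, hi = fit.conf_int().loc["ecmo_day"]
    return float(fit.params["ecmo_day"]), (float(lo), float(hi))
```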
Abstract:
BACKGROUND: Given the expanding scope of extracorporeal membrane oxygenation (ECMO) and its variable impact on drug pharmacokinetics as observed in neonatal studies, it is imperative that the effects of the device on the drugs commonly prescribed in the intensive care unit (ICU) are further investigated. Currently, there are no data to confirm the appropriateness of standard drug dosing in adult patients on ECMO. Ineffective drug regimens in these critically ill patients can seriously worsen patient outcomes. This study was designed to describe the pharmacokinetics of commonly used antimicrobial, analgesic and sedative drugs in adult patients receiving ECMO. METHODS: This is a multi-centre, open-label, descriptive pharmacokinetic (PK) study. Eligible patients will be adults treated with ECMO for severe cardiac and/or respiratory failure at five intensive care units in Australia and New Zealand. Patients will receive the study drugs as part of their routine management. Blood samples will be taken from indwelling catheters to investigate plasma concentrations of several antimicrobials (ceftriaxone, meropenem, vancomycin, ciprofloxacin, gentamicin, piperacillin-tazobactam, ticarcillin-clavulanate, linezolid, fluconazole, voriconazole, caspofungin, oseltamivir) and sedatives and analgesics (midazolam, morphine, fentanyl, propofol, dexmedetomidine, thiopentone). The PK of each drug will be characterised to determine the variability of PK in these patients and to develop dosing guidelines for prescription during ECMO. DISCUSSION: The evidence-based dosing algorithms generated from this analysis can be evaluated in later clinical studies. This knowledge is vitally important for optimising pharmacotherapy in these most severely ill patients, to maximise the opportunity for therapeutic success and minimise the risk of therapeutic failure.
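As an illustration of the kind of concentration-time characterisation such a PK study performs, here is a sketch of a one-compartment model with first-order elimination fitted by nonlinear least squares. The model choice, parameters and data points are illustrative assumptions, not taken from the protocol:

```python
# Sketch: fit C(t) = (Dose/V) * exp(-k*t) (IV bolus, one compartment) to
# plasma concentrations. Time points and concentrations are invented.
import numpy as np
from scipy.optimize import curve_fit

def one_compartment(t, c0, k_elim):
    """Concentration at time t after an IV bolus: C0 * exp(-k * t)."""
    return c0 * np.exp(-k_elim * t)

t_h = np.array([0.5, 1, 2, 4, 8, 12])                  # hours post-dose
conc = np.array([38.0, 35.1, 29.8, 21.5, 11.2, 5.9])   # mg/L, invented
(c0, k), _ = curve_fit(one_compartment, t_h, conc, p0=(40.0, 0.1))
print(f"C0 = {c0:.1f} mg/L, half-life = {np.log(2) / k:.1f} h")
```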
Abstract:
Ecstasy use may result in lowered mood, anxiety or aggression in the days following use, yet few studies have investigated what factors increase the risk of experiencing such symptoms. Ecstasy users (at least one use in the last 12 months) who subsequently took ecstasy (n=35) over the next week were compared, on measures of mood, sleep, stress and drug use, with those who abstained (n=21) that week. Measures were administered the week prior to ecstasy use and 1 and 3 days following use, or on the equivalent days for abstainers. Mood symptoms were assessed using the Kessler-10 self-report psychological distress scale, a subjective mood rating (1-10), and the depression, anxiety and hostility items of the clinician-rated Brief Psychiatric Rating Scale. Timeline followback methods were used to collect information on drug use and life stress in the past month. Self-reported sleep quality was also assessed. Ecstasy use was not associated with subacute depressive, anxiety or aggressive symptoms. Rather, lowered mood and increased psychological distress were associated with the self-reported hours and quality of sleep obtained during the 3-day follow-up. These findings highlight the importance of considering sleep disruption in understanding the short-term mood effects of ecstasy use.
Abstract:
Objective: Substance use is common in first-episode psychosis and complicates the accurate diagnosis and treatment of the disorder. The differentiation of substance-induced psychotic disorders (SIPD) from primary psychotic disorders (PPD) is particularly challenging. This cross-sectional study compares the clinical, substance use and functional characteristics of substance-using first-episode psychosis patients diagnosed with a SIPD or a PPD. Method: Participants were 61 young people (15-24 years) admitted to a psychiatric inpatient service with first-episode psychosis who reported substance use in the past month. Diagnosis was determined using the Psychiatric Research Interview for DSM-IV Substance and Mental Disorders (PRISM-IV). Measures of clinical characteristics (severity of psychotic symptoms, level of insight, history of trauma), substance use (frequency/quantity, severity) and social and occupational functioning were also administered. Results: The PRISM-IV differentially diagnosed 56% of first-episode patients with a SIPD and 44% with a PPD. Those with a SIPD had higher rates of substance use and substance use disorders, higher levels of insight, were more likely to have a forensic and trauma history, and had more severe hostility and anxiety symptoms than those with a PPD. Logistic regression analysis indicated that a family history of psychosis, trauma history and current cannabis dependence were the strongest predictors of a SIPD. Almost 80% of diagnostic predictions of a SIPD were accurate using this model. Conclusions: This clinical profile of SIPD could help facilitate the accurate diagnosis and treatment of SIPD versus PPD in young people with first-episode psychosis admitted to an inpatient psychiatric service.
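The diagnostic prediction model can be sketched briefly. This is a minimal illustration on simulated data: the three binary predictors stand in for family history of psychosis, trauma history and current cannabis dependence, and the simulated outcome will not reproduce the ~80% accuracy the study reports:

```python
# Minimal sketch of a logistic regression classifying SIPD vs PPD from
# three binary predictors, with in-sample accuracy. Data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(61, 3)).astype(float)  # binary predictors
y = (X.sum(axis=1) + rng.normal(0, 0.8, size=61) > 1.5).astype(int)  # 1 = SIPD

model = LogisticRegression().fit(X, y)
print(f"in-sample accuracy: {model.score(X, y):.2f}")
```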
Abstract:
A range of risk management initiatives have been introduced in organisations in an attempt to reduce occupational road incidents. However, a discrepancy exists between the initiatives that are frequently implemented in organisations and the initiatives that have demonstrated scientific merit in improving occupational road safety. Given that employees' beliefs may facilitate or act as a barrier to implementing initiatives, it is important to understand whether initiatives with scientific merit are perceived to be effective by employees. To explore employee perceptions of occupational road safety initiatives, a questionnaire was administered to 679 employees sourced from four Australian organisations. Participants ranged in age from 18 to 65 years (M = 42, SD = 11). Participants rated 35 initiatives on how effective they thought each would be in improving road safety in their organisation. The initiatives perceived by employees to be most effective in managing occupational road risks comprised: making vehicle safety features standard (e.g. passenger airbags); practical driver skills training; and investigation of serious vehicle incidents. The initiatives perceived to be least effective comprised: signing a promise card commitment to drive safely; advertising the organisation's phone number on vehicles for complaints and compliments; and consideration of driving competency in the staff selection process. Employee perceptions were analysed at a factor level and at an initiative level. The mean scores for the three extracted factors revealed that employees believed occupational road risks could best be managed by the employer implementing engineering and human resource methods to enhance road safety. Initiatives relating to employer management of identified risk factors were perceived to be more effective than feedback or motivational methods that required employees to accept responsibility for their driving safety. Practitioners can use the findings from this study to make informed decisions about how they select, manage and market occupational safety initiatives.
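The factor-level analysis can be sketched as an exploratory factor extraction over the 679 x 35 rating matrix. The study's extraction and rotation method is not stated in the abstract, so this is only an illustrative sketch on simulated ratings:

```python
# Sketch: extract three factors from a respondents-by-initiatives rating
# matrix, then summarise mean ratings of each factor's top-loading items.
# Ratings are simulated placeholders, not the survey data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
ratings = rng.integers(1, 6, size=(679, 35)).astype(float)

fa = FactorAnalysis(n_components=3, random_state=0).fit(ratings)
for i, load in enumerate(fa.components_):   # loadings, shape (3, 35)
    top = np.argsort(np.abs(load))[-5:]     # five top-loading initiatives
    print(f"factor {i}: mean rating of top items = {ratings[:, top].mean():.2f}")
```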
Abstract:
Purpose The purpose of this study was to investigate the nature and prevalence of discrimination against people living with HIV/AIDS in West Bengal, India, and how discrimination is associated with depression, suicidal ideation and suicide attempts. Method Semi-structured interviews and the Beck Depression Inventory were administered to 105 HIV-infected persons recruited by incidental sampling at an Integrated Counseling and Testing Center (ICTC) and through networks of people living with HIV/AIDS in the West Bengal area. Results Findings showed that 40.8% of the sample had experienced discrimination in at least one social setting, such as the family (29.1%), health centers (18.4%), the community (17.5%) and the workplace (6.8%). About two-fifths (40.8%) reported experiencing discrimination in multiple social settings. Demographic factors associated with discrimination were gender, age, occupation, education, and current residence. More than half of the sample was suffering from severe depression, while 8.7% had attempted suicide. Discrimination in most areas was significantly associated with suicidal ideation and suicide attempts. Conclusions The prevalence of discrimination associated with HIV/AIDS is high in our sample from West Bengal. While discrimination was not associated with depressive symptomatology, discrimination was associated with suicidal ideation and attempts. These findings suggest that there is an urgent need for interventions to reduce HIV/AIDS-related discrimination in the West Bengal region.
Abstract:
Environmental issues continue to capture international headlines and remain the subject of intense intellectual, political and public debate. As a result, environmental law is widely recognised as the fastest growing area of international jurisprudence. This, combined with the rapid expansion of environmental agreements and policies, has created a burgeoning landscape of administrative, regulatory and judicial regimes. Emerging from these developments are increases in environmental offences and, more recently, environmental crimes. The judicial processing of environmental or 'green' crimes is rapidly developing across many jurisdictions. Since 1979, Australia has played a lead role in the criminal justice processing of environmental offences through the New South Wales Land and Environment Court (NSW LEC). This article draws on case data, observations and interviews with court personnel to examine the ways in which environmental justice is now administered through the existing court structures, and how it has changed since the Court's inception.