848 results for intensive


Relevance: 20.00%

Abstract:

Purpose: While the global education debate remains focused on graduate skills and employability, the absence of a shared language between student, academic and industry stakeholder groups means that defining industry skills requirements is both essential and difficult. The aim of this study was to assess graduate skills requirements in a knowledge intensive industry from a demand perspective as distinct from a curriculum (supply) viewpoint. Design/methodology/approach: Skills items were derived from a breadth of disciplines across academic, policy and industry literature. CEOs and senior managers in the innovation and commercialisation industry were surveyed regarding perceptions of skills in graduates and skills in demand by the firm. Two rounds of exploratory factor analyses were undertaken to examine employers’ perceptions of the skills gap. Findings: First order analysis resolved 10 broad constructs that represent cognitive, interpersonal and intrapersonal skills domains as applied in this industry. Knowledge, leadership and interprofessional collaboration feature as prominent skills. Second order analysis revealed employers’ perceptions of graduate skills specifically centre on organisational fit and organisational success. An over-arching theme relates to performance of the individual in organisations. Research limitations/implications: Our findings suggest that the discourse on employability and the design of curriculum need to shift from instilling lists of skills towards enabling graduates to perform in a diversity of workplace contexts and expectations centred on organisational purpose. Originality/value: In contrast to the heterogeneous nature of industry surveys, we targeted a homogenous sector that is representative of knowledge intensive industries. This study contributes to the broader stakeholder dialogue of the value and application of graduate skills in this and other industry sectors.

Relevance: 20.00%

Abstract:

Purpose A Psychiatric Intensive Care Unit (PICU) and/or High Dependency Unit (HDU) is a locked, intensive treatment facility available to people experiencing acute psychiatric distress. For many people who access public mental health services in Australia, the PICU/HDU is the primary point of admission, and should represent and facilitate timely assessment and an optimum treatment plan under a recovery framework. Nurses are the largest health discipline working in this specialty area of care. The paper aims to discuss these issues. Design/methodology/approach A qualitative study aimed to investigate the skills, experience, and practice of nurses working in the PICU/HDU in relation to a recovery model of care. Identifying how nurses provide care in the PICU/HDU will inform a clinical practice guideline to further support this specialty area of care. Four focus groups were facilitated, with 52 registered nurses attending. Findings The nurse participants identified specific skills under four distinct themes: Storytelling, Treatment and recovery, Taking responsibility, and Safeguarding. The skills highlight the expertise and clinical standard required to support a recovery model of care in the PICU. Research limitations/implications The research findings highlight the urgency for a national PICU/HDU clinical practice guideline. Practical implications A PICU/HDU practice guideline will promote the standard of nursing care required in the PICU/HDU. The PICU/HDU needs to be recognised as a patient-centred, therapeutic opportunity as opposed to a restrictive and custodial clinical area. Social implications Providing transparency of practice in the PICU/HDU and educating nurses in this specialty area of care will improve client outcomes and recovery. Originality/value Very few studies have explored the skills, experience, and practice of nurses working in the PICU/HDU in relation to a recovery model of care. A dearth of research exists on what is required to work in this specialty area of care.

Relevance: 20.00%

Abstract:

Background Anaemia is common in critically ill patients and has a significant negative impact on patients' recovery. Blood conservation strategies have been developed to reduce the incidence of iatrogenic anaemia caused by sampling for diagnostic testing. Objectives To describe practice and local guidelines in adult, paediatric and neonatal Australian intensive care units (ICUs) regarding blood sampling and conservation strategies. Methods Cross-sectional descriptive study, conducted over one week in July 2013 in single adult, paediatric and neonatal ICUs in Brisbane. Data were collected on diagnostic blood samples obtained during the study period, including demographic and acuity data of patients. Institutional blood conservation practice and guidelines were compared against seven evidence-based recommendations. Results A total of 940 blood sampling episodes from 96 patients were examined across the three sites. Arterial blood gas was the predominant reason for blood sampling in each unit, accounting for 82% of adult, 80% of paediatric and 47% of neonatal samples taken (p < 0.001). Adult patients had significantly more samples per day (median [IQR]) than paediatric and neonatal patients (adults 5.0 [2.4]; paediatrics 2.3 [2.9]; neonates 0.7 [2.7]), which significantly increased blood sampling costs per day (median [IQR]: adults AUD$101.11 [54.71]; paediatrics AUD$41.55 [56.74]; neonates AUD$8.13 [14.95]; p < 0.001). The total volume of samples per day (median [IQR]) was also highest in adults (adults 22.3 mL [16.8]; paediatrics 5.0 mL [1.0]; neonates 0.16 mL [0.4]). There was little information about blood conservation strategies in the local clinical practice guidelines, with the adult and neonatal sites including none of the seven recommendations. Conclusions There was significant variation in blood sampling practice and conservation strategies between critical care settings. This has implications not only for anaemia but also for infection control and healthcare costs.

Relevance: 20.00%

Abstract:

Aims & Objectives:
- identify and diagnose the current problems associated with patient care with regard to the nursing management of patients with Sengstaken-Blakemore tubes in situ;
- identify the nursing practice currently in place within the ICU and the hospital;
- identify the method by which the assessment and provision of nursing care is delivered in the ICU.

Relevance: 20.00%

Abstract:

Intensive nursery systems are designed to culture mud crab postlarvae through a critical phase in preparation for stocking into growout systems. This study investigated the influence of stocking density and the provision of artificial habitat on the yield of a cage culture system. For each of three batches of postlarvae, survival, growth and claw loss were assessed after each of three nursery phases ending at crab instars C1/C2, C4/C5 and C7/C8. Survival through the first phase was highly variable among batches, with a maximum survival of 80% from megalops to a mean crab instar of 1.5. Stocking density between 625 and 2300 m⁻² did not influence survival or growth in this first phase. Stocking densities tested in phases 2 and 3 were 62.5, 125 and 250 m⁻². At the end of phases 2 and 3, there were five instar stages present, representing a more than 20-fold size disparity within the populations. Survival became increasingly density-sensitive following the first phase, with higher densities resulting in significantly lower survival (phase 2: 63% vs. 79%; phase 3: 57% vs. 64%). The addition of artificial habitat in the form of pleated netting significantly improved survival at all densities. The mean instar attained by the end of phase 2 was significantly larger at a lower stocking density and without artificial habitat. No significant effect of density or habitat on harvest size was detected in phase 3. The highest incidence of claw loss was 36%, but this was reduced by lowering stocking densities and adding habitat. For intensive commercial production, yield can be significantly increased by the addition of a simple net structure, but it decreases rapidly the longer crablets remain in the nursery.

Relevance: 20.00%

Abstract:

The purpose of this study was to monitor the response of ovarian hormone function to intense exercise and body weight changes in female athletes. Ovarian hormone function was evaluated in 12 female lightweight rowers and 10 age-, height- and weight-matched sedentary controls. Ovarian hormone function was assessed during consecutive competition and off seasons by measurement of peak and average alternate-day overnight urinary oestrone glucuronide (E1G) and pregnanediol glucuronide (PdG) excretion. The competition season was associated with a 5.8 kg (9.3%) body weight loss in the lightweight rowers. Significantly lower competition season peak and average urinary excretion of PdG were found in the lightweight rowers compared with the controls. Lower competition season peak and average urinary excretion of E1G were also found in the lightweight rowers compared with the controls, but the difference did not reach significance. The number of rowing training hours was a significant determinant of peak PdG excretion in the rowers (R² = 0.40; p<0.02). The seasonal suppression of PdG excretion was associated with the degree of weight loss (R² = 0.46; p<0.01). The competition-related decrease in E1G and PdG excretion in the lightweight rowers was predominantly restored during the off season, when exercise intensity and duration were decreased and body weight increased. These results showed a significant (p<0.05) reduction in progesterone metabolite excretion and a non-significant decrease in oestrone metabolite excretion associated with intensive competition season training loads and body weight reduction in female lightweight rowers.

Relevance: 20.00%

Abstract:

Manure management emissions may present a much greater opportunity for greenhouse gas mitigation in the feedlot, pig, chicken meat, egg and dairy industries than the current IPCC and DCC calculation guidelines suggest. The current literature and understanding of manure mass loss throughout the manure management system do not support these guidelines, in which emission rates are fixed and consequently provide no incentive for reduced emissions.
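To make the fixed-rate limitation concrete, here is a minimal Python sketch using entirely hypothetical emission factors and mass-loss fractions (not IPCC or DCC values): a fixed per-head emission factor cannot reflect a change in manure handling, whereas a mass-balance estimate that tracks volatile-solids loss before storage can.

```python
# Illustrative only: all factors below are hypothetical placeholders,
# not IPCC or DCC values.

def fixed_factor_ch4(head_count, ef_kg_ch4_per_head=30.0):
    """Fixed emission-factor approach: emissions scale only with herd size,
    so changes in manure handling leave the estimate unchanged."""
    return head_count * ef_kg_ch4_per_head

def mass_balance_ch4(head_count,
                     vs_kg_per_head=800.0,       # volatile solids excreted per head-year
                     vs_loss_fraction=0.5,       # VS lost before anaerobic storage
                     b0_m3_ch4_per_kg_vs=0.45,   # maximum methane-producing capacity
                     mcf=0.3,                    # methane conversion factor of the system
                     ch4_density_kg_per_m3=0.67):
    """Mass-balance approach: emissions fall when more volatile solids are
    removed (e.g. composted aerobically) before anaerobic storage."""
    vs_to_storage = head_count * vs_kg_per_head * (1.0 - vs_loss_fraction)
    return vs_to_storage * b0_m3_ch4_per_kg_vs * mcf * ch4_density_kg_per_m3

herd = 1000
print("Fixed factor:", fixed_factor_ch4(herd), "kg CH4/yr")
print("Mass balance, 50% pre-storage VS loss:", mass_balance_ch4(herd, vs_loss_fraction=0.5))
print("Mass balance, 70% pre-storage VS loss:", mass_balance_ch4(herd, vs_loss_fraction=0.7))
```

Under the fixed-factor approach the two management scenarios report identical emissions, which is the lack of mitigation incentive the abstract describes.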

Relevance: 20.00%

Abstract:

To adapt to climate variability and a lack of irrigation water, businesses and growers in southern Australia, northern New South Wales and southern Queensland are migrating, or are considering migrating, their operations to northern Australia.

Relevance: 20.00%

Abstract:

Development of new agricultural industries in northern Australia is seen as a way to provide food security in the face of reduced water availability in existing regions in the south. This report aims to identify some of the possible economic consequences of developing a rice industry in the Burdekin region while there is a reduction of output in the Riverina. Annual rice production in the Riverina peaked at 1.7 M tonnes, but the long-term outlook, given climate change impacts on that region and government water buy-backs, is more likely to be less than 800,000 tonnes. Growers are highly efficient water users by international standards, but the ability to offset an anticipated reduction in water availability through further efficiency gains is limited. In recent years growers in the Riverina have diversified their farms to a greater extent, and secondary production systems include beef, sheep and wheat. Production in north Queensland is in its infancy, but a potentially suitable farming system has been developed that includes rice within the sugarcane system without competition, in fact contributing to the production of sugar by increasing yields and controlling weeds. The economic outcomes are estimated with a large-scale, dynamic, computable general equilibrium (CGE) model of the world economy (Tasman Global), scaled down to regional level. CGE models mimic the workings of the economy through a system of interdependent behavioural and accounting equations which are linked to an input-output database. When an economic shock or change is applied to a model, each of the markets adjusts according to the set of behavioural parameters which are underpinned by economic theory (a toy market-clearing sketch follows this abstract). In this study the model is driven by reducing production in the Riverina in accordance with relationships found between water availability and the production of rice, with replacement by other crops, and by increasing rice production in the Burdekin. Three scenarios were considered:
• Scenario 1: Rice is grown using the fallow period between the last ratoon crop of sugarcane and the new planting. In this scenario there is no competition between rice and sugarcane.
• Scenario 2: Rice displaces sugarcane production.
• Scenario 3: Rice is grown on additional land and does not compete with sugarcane.
Two time periods were used, 2030 and 2070, which are the conventional time points for considering climate change impacts. Under scenario 1, real economic output declines in the Riverina by $45 million in 2030 and by $139 million in 2070. This is only partially offset by the increased real economic output in the Burdekin of $35 million and $131 million, respectively.
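As a toy illustration of the market-clearing adjustment described above (and emphatically not the Tasman Global model or its database), the Python sketch below solves for the price at which total supply from two regions meets demand, before and after a hypothetical shock that cuts capacity in one region and expands it in the other. All parameters are invented for illustration.

```python
# Toy illustration only: a single-commodity, two-region market-clearing sketch
# of the kind of adjustment a CGE model performs. Parameters are hypothetical
# and bear no relation to the Tasman Global database.

def supply(price, capacity, elasticity=1.5):
    """Regional supply rises with price, scaled by productive capacity."""
    return capacity * price ** elasticity

def demand(price, base_demand=2000.0, elasticity=-0.8):
    """Aggregate demand falls with price."""
    return base_demand * price ** elasticity

def clearing_price(capacities, lo=0.01, hi=100.0):
    """Find the price at which total supply equals demand (bisection)."""
    def excess(p):
        return sum(supply(p, c) for c in capacities) - demand(p)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if excess(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Baseline: Riverina and Burdekin capacities (hypothetical units).
baseline = [1000.0, 100.0]
# Shock: water buy-backs cut Riverina capacity; the Burdekin expands.
shocked = [550.0, 400.0]

for label, caps in (("baseline", baseline), ("shocked", shocked)):
    p = clearing_price(caps)
    q = [round(supply(p, c), 1) for c in caps]
    print(f"{label}: price={p:.2f}, regional output={q}")
```

A full CGE model performs this adjustment simultaneously across many commodities, regions and households via an input-output database; the sketch only shows the single-market mechanics.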

Relevance: 20.00%

Abstract:

Development of new agricultural industries in northern Australia is often perceived as a solution to changes in water availability that have occurred within southern Australia as a result of changes to government policy in response to, and exacerbated by, climate change. This report examines the likely private, social and community costs and benefits associated with the establishment of a cotton industry in the Burdekin. The research undertaken covers three spatial scales by modelling the response of cotton to climate change at the crop and farm scale and linking this to regional-scale modelling of the economy. Modelling crop growth as either a standalone crop or as part of a farm enterprise provides the clearest picture of how yields and water use will be affected under climate change. The alternative to this is to undertake very costly trials in environmental chambers. For this reason it is critical that funding for model development, especially for crops being grown in novel environments, be seen as a high priority for climate change and adaptation studies. Crop-level simulations not only provide information on how the crop responds to climate change, they also illustrate that these responses are the result of complex interactions and cannot necessarily be derived from the climate information alone. These simulations showed that climate change would lead to decreased cotton yields in 2030 and 2050 without the effect of CO2 fertilisation. Without CO2 fertilisation, yields would be decreased by 3.2% and 17.8%, respectively. Including CO2 fertilisation increased yields initially by 5.9%, but these were reduced by 3.6% in 2050. This still represents a major offset and at least ameliorates the impact of climate change on yield (a small worked example follows this abstract). To cope with the decreased in-crop rainfall (4.5% by 2030 and 15.8% in 2050) and an initial increase in evapotranspiration of 2% in 2030 and
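A small arithmetic sketch of the reported yield changes, applied to a purely hypothetical baseline yield (the abstract gives relative changes only):

```python
# Worked arithmetic using only the percentage changes reported in the abstract.
# The baseline yield (10 bales/ha) is a hypothetical placeholder.

baseline_yield = 10.0  # bales/ha, hypothetical

# Relative yield changes reported for the cotton simulations.
scenarios = {
    "2030, no CO2 fertilisation": -0.032,
    "2050, no CO2 fertilisation": -0.178,
    "2030, with CO2 fertilisation": +0.059,
    "2050, with CO2 fertilisation": -0.036,
}

for label, delta in scenarios.items():
    projected = baseline_yield * (1.0 + delta)
    print(f"{label}: {projected:.2f} bales/ha ({delta:+.1%})")
```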

Relevance: 20.00%

Abstract:

Vegetable cropping systems are often characterised by high inputs of nitrogen fertiliser. Elevated emissions of nitrous oxide (N2O) can be expected as a consequence. In order to mitigate N2O emissions from fertilised agricultural fields, the use of nitrification inhibitors in combination with ammonium-based fertilisers has been promoted. However, no data are currently available on the use of nitrification inhibitors in sub-tropical vegetable systems. A field experiment was conducted to investigate the effect of the nitrification inhibitor 3,4-dimethylpyrazole phosphate (DMPP) on N2O emissions and yield from broccoli production in sub-tropical Australia. Soil N2O fluxes were monitored continuously (3 h sampling frequency) with fully automated, pneumatically operated measuring chambers linked to a sampling control system and a gas chromatograph. Cumulative N2O emissions over the 5-month observation period amounted to 298 g-N/ha, 324 g-N/ha, 411 g-N/ha and 463 g-N/ha in the conventional fertiliser (CONV), the DMPP treatment (DMPP), the DMPP treatment with a 10% reduced fertiliser rate (DMPP-red) and the zero fertiliser (0N) treatments, respectively. The temporal variation of N2O fluxes showed only low emissions over the broccoli cropping phase, but significantly elevated emissions were observed in all treatments after broccoli residues were incorporated into the soil. Overall, 70–90% of the total emissions occurred in this 5-week fallow phase. There was a significant inhibition effect of DMPP on N2O emissions and soil mineral N content over the broccoli cropping phase, where the application of DMPP reduced N2O emissions by 75% compared to the standard practice. However, there was no statistical difference between the treatments during the fallow phase or when the whole season was considered. This study shows that DMPP has the potential to reduce N2O emissions from intensive vegetable systems, but also highlights the importance of post-harvest emissions from incorporated vegetable residues. N2O mitigation strategies in vegetable systems need to target these post-harvest emissions, and a better evaluation of the effect of nitrification inhibitors over the fallow phase is needed.
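For readers unfamiliar with automated-chamber data, here is a minimal Python sketch (not the authors' processing pipeline) of how 3-hourly flux measurements are typically integrated into a cumulative emission. The flux units and the synthetic time series are assumptions for illustration.

```python
# Minimal sketch (not the authors' pipeline): integrating automated-chamber
# N2O flux measurements into a cumulative emission. Assumes fluxes are already
# expressed in ug N2O-N per m^2 per hour at a 3-hourly sampling frequency.

import numpy as np

def cumulative_n2o_g_n_per_ha(fluxes_ug_n_m2_h, hours_between_samples=3.0):
    """Trapezoidal integration of a flux time series, converted to g N/ha."""
    ug_n_per_m2 = np.trapz(fluxes_ug_n_m2_h, dx=hours_between_samples)
    return ug_n_per_m2 * 1e-6 * 10_000  # ug -> g, m^2 -> ha

# Synthetic example: low background fluxes with a post-harvest pulse,
# mimicking the pattern described for the fallow phase.
rng = np.random.default_rng(0)
cropping = rng.uniform(0.5, 2.0, size=800)   # ~100 days of 3-hourly samples
fallow = rng.uniform(5.0, 25.0, size=280)    # ~5 weeks of elevated fluxes
series = np.concatenate([cropping, fallow])

total = cumulative_n2o_g_n_per_ha(series)
fallow_share = cumulative_n2o_g_n_per_ha(fallow) / total
print(f"cumulative: {total:.0f} g N/ha, fallow share: {fallow_share:.0%}")
```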

Relevance: 20.00%

Abstract:

Intensive pig and poultry farming in Australia can be a source of pathogens with implications for food safety and/or human illness. Seven studies were undertaken with the following objectives:
· assess the types of zoonotic pathogens in waste;
· assess the transfer of pathogens during re-use, both within the shed and externally in the environment;
· assess the potential for movement of pathogens via aerosols.
In the first and second studies the extent of zoonotic pathogens was evaluated in both piggery effluent and chicken litter, and Salmonella and Campylobacter were detected in both wastes. In the third study the dynamics of Salmonella during litter re-use were examined, and results showed a trend for lower Salmonella levels and serovar diversity in re-used litter compared to new litter. Thus, re-use within the poultry farming system posed no increased risk. The fourth study addressed the direct risks of pathogens to farm workers due to the re-use of piggery effluent within the pig shed. Based on air-borne Escherichia coli (E. coli) levels, re-using effluent did not pose a risk. In the fifth study high levels of Arcobacter spp. were detected in effluent ponds and freshly irrigated soils, with potential food-safety risks during the irrigation of food crops and pasture. The sixth and seventh studies addressed the risks from aerosols from mechanically ventilated sheds. Staphylococci were shown to have potential as markers, with airborne levels gradually dropping and reaching background levels at a distance of 400 m. Salmonella was detected (at low levels) both inside and outside the shed (at 10 m). Campylobacter was detected only once inside the shed during the 3-year period (at low levels). Results showed there was minimal risk to humans living adjacent to poultry farms. This is the first comprehensive analysis studying key food-safety pathogens and potential public health risks associated with intensively farmed pigs and poultry in Australia.

Relevance: 20.00%

Abstract:

Background The estimated likelihood of lower limb amputation is 10 to 30 times higher amongst people with diabetes compared to those without diabetes. Of all non-traumatic amputations in people with diabetes, 85% are preceded by a foot ulcer. Foot ulceration associated with diabetes (diabetic foot ulcers) is caused by the interplay of several factors, most notably diabetic peripheral neuropathy (DPN), peripheral arterial disease (PAD) and changes in foot structure. These factors have been linked to chronic hyperglycaemia (high levels of glucose in the blood) and the altered metabolic state of diabetes. Control of hyperglycaemia may be important in the healing of ulcers. Objectives To assess the effects of intensive glycaemic control compared to conventional control on the outcome of foot ulcers in people with type 1 and type 2 diabetes. Search methods In December 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; EBSCO CINAHL; Elsevier SCOPUS; ISI Web of Knowledge Web of Science; BioMed Central and LILACS. We also searched clinical trial databases, pharmaceutical trial databases and current international and national clinical guidelines on diabetes foot management for relevant published, unpublished, ongoing and terminated clinical trials. There were no restrictions based on language, date of publication or study setting. Selection criteria Published, unpublished and ongoing randomised controlled trials (RCTs) were considered for inclusion where they investigated the effects of intensive glycaemic control on the outcome of active foot ulcers in people with diabetes. Non-randomised and quasi-randomised trials were excluded. In order to be included, the trial had to have: 1) attempted to maintain or control blood glucose levels and measured changes in markers of glycaemic control (HbA1c or fasting, random, mean, home capillary or urine glucose), and 2) documented the effect of these interventions on active foot ulcer outcomes. Glycaemic interventions included subcutaneous insulin administration, continuous insulin infusion, oral anti-diabetes agents, lifestyle interventions or a combination of these interventions. The definition of the interventional (intensive) group was that it should have a lower glycaemic target than the comparison (conventional) group. Data collection and analysis All review authors independently evaluated the papers identified by the search strategy against the inclusion criteria. Two review authors then independently reviewed all potential full-text articles and trials registry results for inclusion. Main results We identified only one trial that met the inclusion criteria, but this trial did not have any results, so we could not perform the planned subgroup and sensitivity analyses in the absence of data. Two ongoing trials were identified which may provide data for analyses in a later version of this review. The completion date of these trials is currently unknown. Authors' conclusions The current review failed to find any completed randomised clinical trials with results. Therefore we are unable to conclude whether intensive glycaemic control, when compared to conventional glycaemic control, has a positive or detrimental effect on the treatment of foot ulcers in people with diabetes.
Previous evidence has, however, highlighted a reduction in the risk of limb amputation (from various causes) in people with type 2 diabetes with intensive glycaemic control. Whether this applies to people with foot ulcers in particular is unknown. The exact role of intensive glycaemic control in treating foot ulcers within multidisciplinary care (alongside other interventions targeted at treating foot ulcers) requires further investigation.

Relevance: 20.00%

Abstract:

The delivery of products and services for construction-based businesses is increasingly becoming knowledge-driven and information-intensive. The proliferation of building information modelling (BIM) has increased business opportunities as well as introduced new challenges for the architectural, engineering and construction and facilities management (AEC/FM) industry. As such, the effective use, sharing and exchange of building life cycle information and knowledge management in building design, construction, maintenance and operation assumes a position of paramount importance. This paper identifies a subset of construction management (CM) relevant knowledge for different design conditions of building components through a critical, comprehensive review of synthesized literature and other information gathering and knowledge acquisition techniques. It then explores how such domain knowledge can be formalized as ontologies and, subsequently, a query vocabulary in order to equip BIM users with the capacity to query digital models of a building for the retrieval of useful and relevant domain-specific information. The formalized construction knowledge is validated through interviews with domain experts in relation to four case study projects. Additionally, retrospective analyses of several design conditions are used to demonstrate the soundness (realism), completeness, and appeal of the knowledge base and query-based reasoning approach in relation to the state-of-the-art tools, Solibri Model Checker and Navisworks. The knowledge engineering process and the methods applied in this research for information representation and retrieval could provide useful mechanisms to leverage BIM in support of a number of knowledge intensive CM/FM tasks and functions.
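As a minimal illustration of the kind of model query described above (and not the paper's own ontology or query vocabulary), the Python sketch below uses the open-source ifcopenshell library to pull CM-relevant information from an IFC building model. The file name and the chosen design condition are hypothetical.

```python
# Minimal sketch (not the authors' ontology or query vocabulary): retrieving
# construction-management-relevant information from a BIM model using the
# open-source ifcopenshell library. The file path and the design condition
# queried here are hypothetical placeholders.

import ifcopenshell
import ifcopenshell.util.element as element_util

model = ifcopenshell.open("sample_building.ifc")  # hypothetical model file

# Example "query": find load-bearing walls and report their common properties,
# the sort of design condition a CM knowledge base might flag for review.
for wall in model.by_type("IfcWall"):
    psets = element_util.get_psets(wall)
    common = psets.get("Pset_WallCommon", {})
    if common.get("LoadBearing"):
        print(wall.GlobalId, wall.Name, common.get("FireRating"))
```

A formal ontology would sit on top of this kind of retrieval, mapping raw IFC entities and property sets to domain concepts so that queries can be expressed in CM/FM terms rather than in model-schema terms.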

Relevance: 20.00%

Abstract:

The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia from the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain, and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder.

In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05).

EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of bispectral index (BIS) and Entropy to the signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography.

In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50.
Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation.

The prognostic value of EEG-derived indices was evaluated and compared with Transcranial Doppler Ultrasonography (TCD), serum neuron-specific enolase (NSE) and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004).

In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia. Outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects these were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may be an aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
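As background for the entropy indices discussed above, here is a minimal Python sketch of a spectral-entropy-style measure computed from an EEG segment. It is not the commercial State/Response Entropy or BIS algorithm; the sampling rate, frequency band and synthetic signals are assumptions chosen for illustration.

```python
# Minimal sketch (not the commercial Entropy or BIS algorithms): a normalised
# spectral entropy computed from an EEG segment. Sampling rate, band limits
# and the synthetic signals below are illustrative assumptions.

import numpy as np

def spectral_entropy(eeg, fs, band=(0.8, 32.0)):
    """Normalised Shannon entropy of the power spectrum within `band`.
    Values near 1 indicate a broad, noise-like spectrum; values near 0
    indicate power concentrated at a few frequencies (e.g. rhythmic activity)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    p = power[mask]
    p = p / p.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(p)))

# Synthetic 5 s segments at 100 Hz: rhythmic 10 Hz activity vs. broadband noise.
fs = 100.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(1)
rhythmic = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
broadband = rng.standard_normal(t.size)

print("rhythmic:", round(spectral_entropy(rhythmic, fs), 2))
print("broadband:", round(spectral_entropy(broadband, fs), 2))
```

The commercial indices additionally handle artifact rejection, burst suppression and rescaling to their clinical ranges, which this sketch deliberately omits.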