971 results for Death rate
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain, and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that of inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Owing to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50.
Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE), and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital cardiac arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, and p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia. Outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects these values were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
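The exact algorithms behind SE, RE, and BIS are proprietary and are not given in the abstract. Purely as an illustration of the general idea underlying Entropy-type indices, the sketch below computes a normalized spectral entropy of an EEG segment; the sampling rate, band limits, and 0-100 scaling are assumptions, not the monitors' actual parameters.

```python
import numpy as np

def spectral_entropy(eeg, fs=100.0, f_lo=0.8, f_hi=32.0):
    """Normalized spectral entropy of an EEG segment (illustrative only).

    eeg        : 1-D array of EEG samples
    fs         : sampling frequency in Hz (assumed value)
    f_lo, f_hi : frequency band used for the entropy (assumed values)
    Returns a value scaled to 0-100; lower values indicate a more suppressed EEG.
    """
    # Windowed power spectrum via FFT
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    # Keep only the chosen band and normalize to a probability distribution
    band = (freqs >= f_lo) & (freqs <= f_hi)
    p = spectrum[band]
    p = p / p.sum()

    # Shannon entropy, normalized by its maximum (log of the number of bins)
    h = -np.sum(p * np.log(p + 1e-12)) / np.log(len(p))
    return 100.0 * h

# Example: broadband noise gives a high value, a pure sine wave a low one
rng = np.random.default_rng(0)
print(spectral_entropy(rng.standard_normal(500)))                              # high
print(spectral_entropy(np.sin(2 * np.pi * 10 * np.arange(500) / 100.0)))       # low
```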
Abstract:
Objective: We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Background: Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. Method: We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task over a 4-month period. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Results: Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of screening, and only during these busy shifts. Conclusion: Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Application: Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety.
Abstract:
Introduction In 2008, the Food and Drug Administration (FDA) required all new glucose-lowering therapies to show cardiovascular safety, and this applies to the dipeptidyl peptidase (DPP)-4 inhibitors (‘gliptins’). At present, there is contradictory evidence on whether the gliptins increase hospitalizations for heart failure. Areas covered This is an evaluation of the Trial Evaluating Cardiovascular Outcomes with Sitagliptin (TECOS) in high-risk cardiovascular subjects with type 2 diabetes [1]. TECOS demonstrated non-inferiority of sitagliptin to placebo for the primary outcome, which was cardiovascular death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for unstable angina. There was no difference in the rate of hospitalization for heart failure between sitagliptin and placebo. Expert Opinion Despite the results of TECOS, debate over the effects of sitagliptin on the rates of hospitalizations for heart failure continues, with some recent studies suggesting increased rates. Recently, empagliflozin (an inhibitor of sodium-glucose cotransporter 2) has been shown to reduce cardiovascular outcomes in subjects with type 2 diabetes, including the rates of hospitalization for heart failure. In our opinion, these positive findings with empagliflozin suggest that it should be prescribed in preference to the gliptins, including sitagliptin, unless any positive cardiovascular outcomes are reported for the gliptins.
Abstract:
Background. Kidney transplantation (KTX) is considered to be the best treatment of terminal uremia. Despite improvements in short-term graft survival, a considerable number of kidney allografts are lost due to the premature death of patients with a functioning kidney and to chronic allograft nephropathy (CAN). Aim. To investigate the risk factors involved in the progression of CAN and to analyze diagnostic methods for this entity. Materials and methods. Altogether, 153 implant and 364 protocol biopsies obtained between June 1996 and April 2008 were analyzed. The biopsies were classified according to Banff ’97 and the chronic allograft damage index (CADI). Immunohistochemistry for TGF-β1 was performed in 49 biopsies. Kidney function was evaluated by creatinine and/or cystatin C measurement and by various estimates of glomerular filtration rate (GFR). Demographic data of the donors and recipients were recorded after 2 years’ follow-up. Results. Most of the 3-month biopsies (73%) were nearly normal. The mean CADI score in the 6-month biopsies decreased significantly after 2001. Diastolic hypertension correlated with ΔCADI. Serum creatinine concentration at hospital discharge and glomerulosclerosis were risk factors for ΔCADI. High total and LDL cholesterol, low HDL cholesterol, and hypertension correlated with chronic histological changes. The mean age of the donors increased from 41 to 52 years. Older donors were more often women who had died from an underlying disease. The prevalence of delayed graft function (DGF) increased over the years, while acute rejections (AR) decreased significantly. Subclinical AR was observed in 4% and did not affect long-term allograft function or CADI. The recipients’ drug treatment was modified over the course of the studies, with mycophenolate mofetil, tacrolimus, statins, and blockers of the renin-angiotensin system more frequently prescribed after 2001. Patients with a higher ΔCADI had lower GFR during follow-up. A CADI over 2 was best predicted by creatinine, although with modest sensitivity and specificity. Neither cystatin C nor other estimates of GFR were superior to creatinine for CADI prediction. Cyclosporine A toxicity was seldom seen. A low cyclosporine A concentration at 2 h correlated with TGF-β1 expression in interstitial inflammatory cells, and this predicted worse graft function. Conclusions. The progression of CAN was affected by two major factors: the donors’ characteristics and the recipients’ hypertension. The increased prevalence of DGF might be a consequence of the acceptance of older donors who had died from an underlying disease. Implant biopsies proved to be of prognostic value, and they are essential for comparison with subsequent biopsies. The progression of histological damage was associated with hypertension and dyslipidemia. The significance of the augmented expression of TGF-β1 in inflammatory cells is unclear, but it may be related to low immunosuppression. Serum creatinine is the most suitable tool for monitoring kidney allograft function on an everyday basis. However, protocol biopsies at 6 and 12 months predicted late kidney allograft dysfunction and affected the clinical management of the patients. Protocol biopsies are thus a suitable surrogate for use in clinical trials and for monitoring kidney allografts.
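The abstract does not state which GFR estimates were compared with creatinine and cystatin C. Purely as an illustration of a creatinine-based estimate, the sketch below implements the classical Cockcroft-Gault creatinine clearance formula; the function and variable names are illustrative, and this is not necessarily one of the estimates used in the thesis.

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (ml/min) by the Cockcroft-Gault formula.

    One classical creatinine-based estimate of kidney function, shown only to
    illustrate how GFR can be approximated from serum creatinine.
    """
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    if female:
        crcl *= 0.85  # standard correction factor for women
    return crcl

# Example: a 52-year-old, 70 kg woman with serum creatinine 1.1 mg/dl
print(round(cockcroft_gault(52, 70, 1.1, female=True), 1))  # ~66 ml/min
```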
Abstract:
Congenital long QT syndrome (LQTS), with an estimated prevalence of 1:2000-1:10 000, manifests with a prolonged QT interval on the electrocardiogram and a risk for ventricular arrhythmias and sudden death. Several ion channel genes and hundreds of mutations in these genes have been identified to underlie the disorder. In Finland, four LQTS founder mutations of potassium channel genes account for up to 40-70% of the genetic spectrum of LQTS. Acquired LQTS has similar clinical manifestations, but often arises from the use of QT-prolonging medication or electrolyte disturbances. A prolonged QT interval is associated with increased morbidity and mortality not only in clinical LQTS but also in patients with ischemic heart disease and in the general population. The principal aim of this study was to estimate the actual prevalence of LQTS founder mutations in Finland and to calculate their effect on the QT interval in the Finnish background population. Using a large population-based sample of over 6000 Finnish individuals from the Health 2000 Survey, we identified the LQTS founder mutations KCNQ1 G589D (n=8), KCNQ1 IVS7-2A>G (n=1), KCNH2 L552S (n=2), and KCNH2 R176W (n=16) in 27 study participants. This resulted in a weighted prevalence estimate of 0.4% for LQTS in Finland. Using a linear regression model, the founder mutations resulted in a 22- to 50-ms prolongation of the age-, sex-, and heart rate-adjusted QT interval. Collectively, these data suggest that one in 250 individuals in Finland may be genetically predisposed to ventricular arrhythmias arising from the four LQTS founder mutations. A KCNE1 D85N minor allele with a frequency of 1.4% was associated with a 10-ms prolongation of the adjusted QT interval and could thus identify individuals at increased risk of ventricular arrhythmias at the population level. In addition, the previously reported associations of KCNH2 K897T, KCNH2 rs3807375, and NOS1AP rs2880058 with QT interval duration were confirmed in the present study. In a separate study, LQTS founder mutations were identified in a subgroup of acquired LQTS, providing further evidence that congenital LQTS gene mutations may underlie acquired LQTS. Catecholaminergic polymorphic ventricular tachycardia (CPVT) is characterized by exercise-induced ventricular arrhythmias in a structurally normal heart and results from defects in cardiac Ca2+ signaling proteins, mainly the ryanodine receptor type 2 (RyR2). In a patient population with typical CPVT, RyR2 mutations were identifiable in 25% (4/16) of patients, implying that noncoding variants or other genes are involved in CPVT pathogenesis. A 1.1 kb RyR2 exon 3 deletion was identified independently in two patients, suggesting that this region may provide a new target for RyR2-related molecular genetic studies. Two novel RyR2 mutations showing a gain-of-function defect in vitro were identified in three victims of sudden cardiac death. Extended pedigree analyses revealed some surviving mutation carriers with mild structural abnormalities of the heart and resting ventricular arrhythmias, suggesting that not all RyR2 mutations lead to a typical CPVT phenotype and underscoring the relevance of tailored risk stratification of RyR2 mutation carriers.
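The abstract refers to a heart rate-adjusted QT interval but does not state which correction was applied (the study adjusted for age and sex as well, via regression). As a generic illustration only, the sketch below computes the widely used Bazett and Fridericia heart rate corrections; these specific formulas are assumptions, not necessarily the adjustment used in the study.

```python
def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett-corrected QT: QTc = QT / sqrt(RR), with RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / (rr_s ** 0.5)

def qtc_fridericia(qt_ms, heart_rate_bpm):
    """Fridericia-corrected QT: QTc = QT / RR^(1/3), with RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / (rr_s ** (1.0 / 3.0))

# Example: a measured QT of 400 ms at 75 bpm (RR = 0.8 s)
print(round(qtc_bazett(400, 75)))      # ~447 ms
print(round(qtc_fridericia(400, 75)))  # ~431 ms
```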
Abstract:
Variability in rainfall is known to be a major influence on the dynamics of tropical forests, especially rates and patterns of tree mortality. In tropical dry forests, a number of contributing factors to tree mortality, including dry season fire and herbivory by large herbivorous mammals, could be related to rainfall patterns, while loss of water potential in trees during the dry season or a wet season drought could also result in enhanced rates of death. While tree mortality as influenced by severe drought has been examined in tropical wet forests, there is insufficient understanding of this process in tropical dry forests. We examined these causal factors of tree mortality in relation to inter-annual differences in rainfall within a 50-ha Forest Dynamics Plot located in the tropical dry deciduous forests of Mudumalai, southern India, which has been monitored annually since 1988. Over a 19-year period (1988-2007), the mean annual mortality rate of all stems >1 cm dbh was 6.9 ± 4.6% (range = 1.5-17.5%); mortality rates broadly declined from the smaller to the larger size classes, with the rates in stems >30 cm dbh being among the lowest recorded in tropical forests globally. Fire was the main agent of mortality in stems 1-5 cm dbh, elephant herbivory in stems 5-10 cm dbh, and other natural causes in stems >10 cm dbh. Elephant-related mortality did not show any relationship to rainfall. On the other hand, fire-related mortality was significantly negatively correlated with the quantity of rainfall during the preceding year. Mortality due to other causes in the larger stem sizes was significantly negatively correlated with rainfall with a 2-3-year lag, suggesting that water deficit from mild or prolonged drought enhanced the risk of death, but only with a time lag that was greater than similar lags in tree mortality observed in other forest types. In this respect, tropical dry forests growing in regions of high rainfall variability may have evolved greater resistance to rainfall deficit than tropical moist or temperate forests but are still vulnerable to drought-related mortality.
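The abstract reports annual mortality rates derived from repeated censuses but does not give the exact calculation used; as a hedged sketch (the plot's actual protocol may differ), a common way to compute an annual mortality rate and the corresponding exponential mortality coefficient from two censuses of the same cohort is shown below.

```python
import math

def annual_mortality(n_initial, n_survivors, interval_years):
    """Annual mortality rate and exponential mortality coefficient
    between two censuses of the same cohort of stems.

    n_initial      : stems alive at the first census
    n_survivors    : of those, stems still alive at the second census
    interval_years : time between the censuses
    """
    # Annual probability of death (compound-interest form)
    m = 1.0 - (n_survivors / n_initial) ** (1.0 / interval_years)
    # Exponential (instantaneous) mortality coefficient
    lam = (math.log(n_initial) - math.log(n_survivors)) / interval_years
    return m, lam

# Example: 1000 stems of which 931 survive one year -> ~6.9% annual mortality
print(annual_mortality(1000, 931, 1.0))
```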
Abstract:
Introduction Patients who have survived sepsis syndromes have a poor quality of life and a high rate of recurring illness or mortality. Follow-up clinics have been instituted for patients after general intensive care, but the evidence is sparse, and there has been no clinic specifically for survivors of sepsis. The aim of this trial is to investigate whether targeted screening and appropriate intervention in these patients can result in an improved quality of life (Short Form 36 health survey (SF36 V.2)), decreased mortality in the first 12 months, decreased readmission to hospital and/or decreased use of health resources. Methods and analysis 204 patients after sepsis syndromes will be randomised to one of two groups. The intervention group will attend an outpatient clinic every two months for 6 months and receive screening and targeted intervention. The usual care group will remain under the care of their physician. To analyse the results, a baseline comparison will be carried out between the groups. Generalised estimating equations will compare the SF36 domain scores between groups and across time points. Mortality will be compared between groups using a Cox proportional hazards (time until death) analysis. Time to first readmission will be compared between groups by survival analysis. Healthcare costs will be compared between groups using a generalised linear model. The economic (health resource) evaluation will be a within-trial incremental cost-utility analysis with a societal perspective. Ethics and dissemination Ethical approval has been granted by the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (HREC; HREC/13/QRBW/17), The University of Queensland HREC (2013000543), Griffith University (RHS/08/14/HREC) and the Australian Government Department of Health (26/2013). The results of this study will be submitted to peer-reviewed intensive care journals and presented at national and international intensive care and/or rehabilitation conferences.
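The analysis plan specifies a Cox proportional hazards model for mortality. A minimal sketch of such an analysis with the Python lifelines package is shown below; the column names and data are illustrative, not the trial's actual dataset or pre-specified covariates.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative data frame: one row per patient.
# 'time_to_death_days' is follow-up time, 'died' is the event indicator,
# 'intervention' marks allocation to the follow-up clinic (1) or usual care (0).
df = pd.DataFrame({
    "time_to_death_days": [365, 120, 365, 300, 45, 365, 365, 210],
    "died":               [0,   1,   0,   1,   1,  0,   0,   1],
    "intervention":       [1,   0,   1,   1,   0,  1,   0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_death_days", event_col="died")
cph.print_summary()  # hazard ratio for 'intervention' with confidence interval
```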
Abstract:
Soft tissue sarcomas (STS) are rare tumors of soft tissue occurring most frequently in the extremities. Modern treatment of extremity STS is based on limb-sparing surgery combined with radiotherapy. To prevent local recurrence, a healthy tissue margin of 2.5 cm around the resected tumor is required. This results in large defects of soft tissue and bone, necessitating the use of reconstructive surgery to achieve wound closure. When local or pedicled soft tissue flaps are unavailable, reconstruction with free flaps is used. Free flaps are elevated at a distant site and have their blood flow restored at the recipient site through microvascular anastomosis. When limb-sparing surgery is not possible, amputation is the only option. Proximal amputation such as forequarter amputation (FQA) causes considerable morbidity, but is nevertheless warranted for carefully selected patients for cure or palliation. A total of 116 patients treated between 1985 and 2006 were included in the study. Of these, 93 patients were treated with limb-sparing surgery and microvascular reconstructive surgery after resection of extremity STS. 25 patients who underwent FQA were also included. Patients were identified and their medical records retrospectively reviewed. In all, 105 free flap procedures were performed in 103 patients. A total of 95 curatively treated STS patients were included in the survival analysis. The latissimus dorsi flap, used in 56% of cases, was the most frequently used free flap. The free flap success rate was 96%. There were 9% microvascular anastomosis complications and 15% wound complications. For curatively treated STS patients, local recurrence-free survival at 5 years was 73.1%, metastasis-free survival 58.3%, and overall disease-specific survival 68.9%. Functional results were good, with 75% of patients regaining normal or near-normal function after lower extremity STS resection and 55% after upper extremity STS resection. Among curatively treated forequarter amputees, 5-year disease-free survival was 44%. In the palliatively treated group, the median time until death from disease was 14 months. Microvascular reconstruction after extremity soft tissue sarcoma resection is safe and reliable, and produces well-healing wounds allowing early oncological treatment. The oncological outcome after these procedures is comparable to that of other extremity sarcoma patients. Functional results are generally good. Forequarter amputation is a useful treatment option for soft tissue tumors of the shoulder girdle and proximal upper extremity. When free flap coverage of an extended forequarter amputation is required, the preferable flap is a fillet flap from the amputated extremity. An acceptable oncological outcome is achieved for curatively treated FQA patients. In palliatively treated patients, considerable periods of improved quality of life can be achieved.
Abstract:
Patients with locally advanced (T3-4 M0) or bone-metastasized (T1-4 M1) prostate cancer were randomized to surgical castration (orchiectomy) or medical castration with intramuscular polyestradiol phosphate (PEP) at a dose of 240 mg/month. The clinical efficacy of the treatments and their cardiovascular (CV) complications were compared. Pretreatment plasma testosterone (T) and estradiol (E2) concentrations were also compared between T3-4 M0 and T1-4 M1 patients, and the effect of the patients' general condition on these hormone levels was examined. Finally, a prognostic classification of the risk of prostate cancer death into three risk groups was created for T1-4 M1 patients using pretreatment prognostic factors. No statistically significant difference in clinical efficacy was found between orchiectomy and PEP treatment. As expected, the prognosis of T1-4 M1 patients was worse than that of T3-4 M0 patients. Among T1-4 M1 patients, there was no statistical difference in CV deaths between the treatment groups, but nonfatal CV complications were more frequent in the PEP group (5.9%) than in the orchiectomy group (2.0%). Among T3-4 M0 patients, PEP treatment was associated with a statistically significant risk of CV mortality compared with orchiectomy (p = 0.001). In the PEP group, 67% of the deaths were acute myocardial infarctions. This PEP-related risk of myocardial infarction (including nonfatal infarctions) was significantly lower in patients whose pretreatment E2 level was at least 93 pmol/l (p = 0.022). The E2 level was significantly lower in T1-4 M1 patients (74.7 pmol/l) than in T3-4 M0 patients (87.9 pmol/l), but there was no corresponding difference in T levels. In both T3-4 M0 and T1-4 M1 patients, a decline in general condition partly explained the individual decline in T and E2 levels. The risk group classification (Rg) of prostate cancer death into three groups was created using alkaline phosphatase (ALP), prostate-specific antigen (PSA), erythrocyte sedimentation rate (ESR), and patient age. One risk point was given for each of ALP > 180 U/l (ALP > 83 U/l with the currently used assay), PSA > 35 µg/l, ESR > 80 mm/h, and age < 60 years. The points were then summed, and the following groups were formed: Rg-a (0-1 risk points), Rg-b (2 risk points), and Rg-c (3-4 risk points). The risk of prostate cancer death increased significantly from one risk group to the next (p < 0.001). The Rg classification was clinically practical and good at identifying patients with a poor prognosis.
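A minimal sketch of the three-group risk classification described above, using the thresholds given in the abstract (the ALP threshold of 83 U/l for the currently used assay is applied here; the function and variable names are illustrative).

```python
def prostate_cancer_risk_group(alp_u_l, psa_ug_l, esr_mm_h, age_years):
    """Assign the Rg risk group from pretreatment values.

    One point each for ALP > 83 U/l (current assay), PSA > 35 µg/l,
    ESR > 80 mm/h, and age < 60 years; the points are then summed.
    """
    points = 0
    points += alp_u_l > 83       # alkaline phosphatase
    points += psa_ug_l > 35      # prostate-specific antigen
    points += esr_mm_h > 80      # erythrocyte sedimentation rate
    points += age_years < 60     # younger age carries a worse prognosis here
    if points <= 1:
        return "Rg-a"
    if points == 2:
        return "Rg-b"
    return "Rg-c"  # 3-4 points

# Example: ALP 120 U/l, PSA 50 µg/l, ESR 40 mm/h, age 72 -> 2 points -> Rg-b
print(prostate_cancer_risk_group(120, 50, 40, 72))
```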
Abstract:
Space-time block codes (STBCs) that are single-symbol decodable (SSD) in a co-located multiple antenna setting need not be SSD in a distributed cooperative communication setting. A relay network with N relays and a single source-destination pair is called a partially-coherent relay channel (PCRC) if the destination has perfect channel state information (CSI) of all the channels and the relays have only the phase information of the source-to-relay channels. In our earlier work, we derived a set of necessary and sufficient conditions for a distributed STBC (DSTBC) to be SSD for a PCRC. Using these conditions, in this paper we show that the possibility of a channel phase compensation operation at the relay nodes, using partial CSI at the relays, increases the possible rate of SSD DSTBCs from 2/N, when the relays do not have CSI, to 1/2, which is independent of N. We also show that when a DSTBC is SSD for a PCRC, arbitrary coordinate interleaving of the in-phase and quadrature-phase components of the variables does not disturb its SSD property. Using this property, we are able to construct codes that are SSD and have a rate higher than 2/N, but which give full diversity only for signal constellations satisfying certain conditions.
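As a concrete illustration of the rate improvement stated above (the values N = 4 and N = 8 are examples only, not cases analyzed in the abstract):

```latex
% SSD DSTBC rate without CSI at the relays vs. with phase compensation
R_{\text{no CSI}} = \frac{2}{N}, \qquad R_{\text{phase comp.}} = \frac{1}{2}
\quad\Longrightarrow\quad
N = 4:\ \tfrac{1}{2}\ \text{vs.}\ \tfrac{1}{2}, \qquad
N = 8:\ \tfrac{1}{4}\ \text{vs.}\ \tfrac{1}{2},
```

so the gain from phase compensation grows with the number of relays, since the compensated rate does not decay with N.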
Abstract:
The structural relaxation behavior of a rapidly quenched (RQ) and a slowly cooled Pd40Cu30Ni10P20 metallic glass was investigated and compared. Differential scanning calorimetry was employed to monitor the relaxation enthalpies at the glass transition temperature, Tg, and the Kohlrausch-Williams-Watts (KWW) stretched exponential function was used to describe their variation with annealing time. It was found that the rate of enthalpy recovery is higher in the ribbon, implying that the bulk is more resistant to relaxation at low annealing temperatures. This was attributed to the possibility that the cooling rate affects the locations where the glasses get trapped within the potential energy landscape. The RQ process traps a larger amount of free volume, resulting in higher fragility, and in turn relaxes at the slightest thermal excitation (annealing). The slowly cooled bulk metallic glass (BMG), on the other hand, entraps less free volume and has more short-range ordering, hence requiring a larger amount of perturbation to access lower energy basins.
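For reference, the KWW stretched exponential mentioned above is commonly written in the following standard form; the study's fitted parameter values are not given in this abstract, and the enthalpy-recovery expression is one often-used way of applying it, not necessarily the exact fit used here.

```latex
% KWW stretched-exponential relaxation function
\phi(t) = \exp\!\left[-\left(\frac{t}{\tau}\right)^{\beta}\right], \qquad 0 < \beta \le 1,
% with tau the characteristic relaxation time and beta the stretching exponent;
% the relaxation enthalpy recovered after annealing time t_a is then often modeled as
\Delta H(t_a) \approx \Delta H_{\infty}\left[\,1 - \phi(t_a)\,\right].
```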
Abstract:
Space-time block codes (STBCs) obtained from non-square complex orthogonal designs are bandwidth efficient compared with those from square real/complex orthogonal designs for colocated coherent MIMO systems and have other applications in (i) non-coherent MIMO systems with non-differential detection, (ii) space-time-frequency codes for MIMO-OFDM systems and (iii) distributed space-time coding for relay channels. Liang (IEEE Trans. Inform. Theory, 2003) has constructed maximal-rate non-square designs for any number of antennas, with rates given by (a+1)/(2a) when the number of transmit antennas is 2a-1 or 2a. However, these designs have large delays. When a large number of antennas is considered, this rate is close to 1/2. Tarokh et al. (IEEE Trans. Inform. Theory, 1999) have constructed rate-1/2 non-square CODs using the rate-1 real orthogonal designs for any number of antennas, where the decoding delay of these codes is smaller than that of the codes constructed by Liang when the number of transmit antennas is more than 5. In this paper, we construct a class of rate-1/2 codes for an arbitrary number of antennas where the decoding delay is reduced by 50% compared with the rate-1/2 codes given by Tarokh et al. It is also shown that even though scaling the variables helps to lower the delay, it cannot be used to increase the rate.
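A worked instance of the maximal-rate expression quoted above (only the formula stated in the abstract is used; the values of a are example choices):

```latex
% Maximal rate of non-square CODs (Liang, 2003) for 2a-1 or 2a transmit antennas
R(a) = \frac{a+1}{2a}, \qquad
R(2) = \tfrac{3}{4}, \quad R(3) = \tfrac{2}{3}, \quad R(8) = \tfrac{9}{16},
\qquad \lim_{a \to \infty} R(a) = \tfrac{1}{2},
```

which is why the maximal rate approaches 1/2 for large antenna arrays, matching the rate-1/2 constructions discussed in the paper.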