Resumo:
The outcome effect occurs when an evaluator who has knowledge of the outcome of a decision maker's judgment assesses the quality of that judgment. If the evaluator has knowledge of a negative outcome, that knowledge negatively influences his or her assessment of the ex ante judgment. For instance, jurors in a lawsuit brought against an auditor for alleged negligence are informed of a fraud that went undetected even though an unqualified opinion was issued. This paper reports the results of an experiment in an applied audit judgment setting that examined methods of mitigating the outcome effect by means of instructions. The results showed that simply instructing or warning the evaluator about the potential biasing effects of outcome information was only weakly effective. However, instructions that stressed either (1) the cognitive non-normativeness of the outcome effect or (2) the seriousness and gravity of the evaluation ameliorated the effect significantly. From a theoretical perspective, the results suggest that there may be both motivational and cognitive components to the outcome effect. In all, the findings suggest that awareness of the outcome effect, together with relatively nonintrusive instructions to evaluators, may effectively counteract the potential for outcome bias.
Resumo:
Objectives: To determine (i) factors which predict whether patients hospitalised with acute myocardial infarction (AMI) receive care discordant with recommendations of clinical practice guidelines; and (ii) whether such discordant care results in worse outcomes compared with receiving guideline-concordant care. Design: Retrospective cohort study. Setting: Two community general hospitals. Participants: 607 consecutive patients admitted with AMI between July 1997 and December 2000. Main outcome measures: Clinical predictors of discordant care; crude and risk-adjusted rates of in-hospital mortality and reinfarction, and mean length of hospital stay. Results: At least one treatment recommendation for AMI was applicable for 602 of the 607 patients. Of these patients, 411 (68%) received concordant care and 191 (32%) discordant care. Positive predictors at presentation of discordant care were age > 65 years (odds ratio [OR], 2.5; 95% CI, 1.7-3.6), silent infarction (OR, 2.7; 95% CI, 1.6-4.6), anterior infarction (OR, 2.5; 95% CI, 1.7-3.8), a history of heart failure (OR, 6.3; 95% CI, 3.7-10.7), chronic atrial fibrillation (OR, 3.2; 95% CI, 1.5-6.4), and heart rate greater than or equal to 100 beats/min (OR, 2.1; 95% CI, 1.4-3.1). Death occurred in 12.0% (23/191) of discordant-care patients versus 4.6% (19/411) of concordant-care patients (adjusted OR, 2.42; 95% CI, 1.22-4.82). Mortality was inversely related to the level of guideline concordance (P = 0.03). Reinfarction rates also tended to be higher in the discordant-care group (4.2% v 1.7%; adjusted OR, 2.5; 95% CI, 0.90-7.1). Conclusions: Certain clinical features at presentation predict a higher likelihood of guideline-discordant care in patients presenting with AMI. Such care appears to increase the risk of in-hospital death.
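For readers who want to check the arithmetic behind figures like the adjusted OR of 2.42 above, the minimal Python sketch below reproduces the crude (unadjusted) odds ratio and a Woolf-type 95% confidence interval directly from the reported death counts; the adjusted estimate additionally controls for baseline covariates in a regression model, which is not reproduced here.

# Crude odds ratio and Woolf (log-based) 95% CI for in-hospital death, using
# the raw counts reported in the abstract above (23/191 discordant-care deaths
# vs 19/411 concordant-care deaths). The adjusted OR of 2.42 comes from a
# covariate-adjusted model and is not reproduced here.
import math

a, b = 23, 191 - 23   # discordant care: deaths, survivors
c, d = 19, 411 - 19   # concordant care: deaths, survivors

or_crude = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_crude) - 1.96 * se_log_or)
hi = math.exp(math.log(or_crude) + 1.96 * se_log_or)
print(f"crude OR = {or_crude:.2f}, 95% CI {lo:.2f}-{hi:.2f}")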
Resumo:
Two studies were conducted to examine the effects of including NaCl at various rates in grain-based supplements for Friesian cows grazing established, dominant (>90%), rainfed kikuyu (Pennisetum clandestinum cv. Common) pastures during summer and autumn in a humid sub-tropical environment. In study 1 (19 January-27 March 1998), 48 cows (36 multiparous, 12 primiparous; 27-96 days postpartum) were allocated to one of four groups based on genetic merit, milk production, liveweight (LW) and days postpartum. They were fed (2.7 kg dry matter (DM) per cow, twice a day) one of four isoenergetic and isonitrogenous barley grain-based concentrates containing NaCl at concentrations (% as-fed) of either 0 (SC1), 1.1 (SC2), 2.2 (SC3) or 3.3 (SC4). The maximum temperature-humidity index (THImax) was greater than or equal to 78 during 50% of the experimental period. Concentrate NaCl content had no effect (P > 0.05) on daily milk yield or LW change, but daily yields of 4% fat-corrected milk (FCM), fat and protein were higher (P < 0.05) […] (P > 0.05) among treatments at 7.6 ± 1.24 kg DM per cow. In study 2 (18 January 1999-1 March 1999), 48 cows (32 pluriparous, 16 primiparous; 32-160 days postpartum) were fed (2.7 kg DM per cow, twice a day) one of two isoenergetic and isonitrogenous barley grain-based concentrates containing NaCl at concentrations (% as-fed) of 0 (control) or 2.2 (HSC). THImax was greater than or equal to 78 during 34% of days in the experimental period. Yields of milk, FCM, fat and protein were lower (P < 0.05) […] (P > 0.05) by concentrate NaCl content. These studies indicate that NaCl supplementation can be beneficial in terms of milk production during warm, humid conditions as opposed to milder conditions. (C) 2002 Elsevier Science B.V. All rights reserved.
Resumo:
Background: The aim of this study was to examine minor physical anomalies and quantitative measures of the head and face in patients with psychosis vs healthy controls. Methods: Based on a comprehensive prevalence study of psychosis, we recruited 310 individuals with psychosis and 303 controls. From this sample, we matched 180 case-control pairs for age and sex. Individual minor physical anomalies and quantitative measures related to head size and facial height and depth were compared within the matched pairs. Based on all subjects, we examined the specificity of the findings by comparing craniofacial summary scores in patients with nonaffective or affective psychosis and controls. Results: The odds of having a psychotic disorder were increased in those with wider skull bases (odds ratio [OR], 1.40; 95% confidence interval [CI], 1.02-1.17), smaller lower-facial heights (glabella to subnasal) (OR, 0.57; 95% CI, 0.44-0.75), protruding ears (OR, 1.72; 95% CI, 1.05-2.82), and shorter (OR, 2.29; 95% CI, 1.37-3.82) and wider (OR, 2.28; 95% CI, 1.43-3.65) palates. Compared with controls, those with psychotic disorder had skulls that were more brachycephalic. These differences were found to distinguish patients with nonaffective and affective psychoses from controls. Conclusions: Several of the features that differentiate patients from controls relate to the development of the neuro-basicranial complex and the adjacent temporal and frontal lobes. Future research should examine both the temporal lobe and the middle cranial fossa to reconcile our anthropometric findings with the literature showing smaller temporal lobes in patients with schizophrenia. Closer attention to the skull base may provide clues to the nature and timing of altered brain development in patients with psychosis.
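As a hedged illustration of how odds ratios are estimated in an individually matched design like the one above: for a binary anomaly, a matched-pairs (conditional) odds ratio can be computed from the pairs that are discordant on the anomaly. The pair counts in this Python sketch are hypothetical, not the study's data, and a full analysis of the continuous measures would typically use conditional logistic regression.

# Matched-pairs odds ratio for a binary anomaly in a 1:1 matched case-control
# design: only exposure-discordant pairs are informative. Counts are hypothetical.
import math

pairs_case_only = 30     # pairs where the case has the anomaly, the matched control does not (hypothetical)
pairs_control_only = 15  # pairs where the control has the anomaly, the matched case does not (hypothetical)

or_matched = pairs_case_only / pairs_control_only
se = math.sqrt(1 / pairs_case_only + 1 / pairs_control_only)
lo = math.exp(math.log(or_matched) - 1.96 * se)
hi = math.exp(math.log(or_matched) + 1.96 * se)
print(f"matched-pairs OR = {or_matched:.2f}, 95% CI {lo:.2f}-{hi:.2f}")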
Resumo:
OBJECTIVE Because there is discordance between different immunoassay values for serum hGH, and because clinical state may not correlate with immunoreactive hGH, we have developed an assay to accurately measure serum hGH somatogenic bioactivity. The results of this assay were compared with the Elegance two-site ELISA assay across 135 patient samples in a variety of clinical states. DESIGN The somatogenic assay was based on stable expression of hGH receptor in the murine BaF line, allowing these cells to proliferate in response to hGH. To eliminate interference by other growth factors in serum, we created a specific antagonist of the hGH receptor (similar to Trovert or Pegvisomant) which allowed us to obtain a true measure of hGH somatogenic activity by subtraction of the activity in the presence of the antagonist. The assay was carried out in microtiter plates over 24 h, with oxidation of a chromogenic tetrazolium salt (MTT) as the endpoint. PATIENTS These encompassed a number of different clinical conditions related to short stature, including idiopathic short stature, neurosecretory dysfunction and renal failure, as well as obese patients on dietary restriction and normal volunteers. MEASUREMENTS In addition to the colourimetric (MTT) response to hGH, we measured free hGH by stripping out GHBP-bound hGH using beads coupled to a monoclonal antibody to the GHBP (GH binding protein). All samples were measured in both bioassay and ELISA assay. RESULTS This bioassay was sensitive (5 mU/l or 2 µg/l) and precise, and not subject to interference by the GHBP. There was a good correlation (r = 0.95) between bioactivity and immunoactivity across clinical states. There was, however, an increased bioactivity during secretory peaks (over 25 mU/l), which has been reported previously for the Nb2 bioassay. Free hGH did not correlate with clinical state. CONCLUSIONS Because the results of the Elegance ELISA and the bioassay correlate well, even though there is greater bioactivity at higher hormone concentrations, it is evident that an appropriate immunoassay is able to act as a reliable indicator for clinical assessment. In those rare cases where bio-inactive GH exists, our bioassay should provide an appropriate means to demonstrate this.
Resumo:
A grey snapper (Lutjanus griseus), a grouper (Serranidae) and a blackjack (Caranx lugubris) were implicated in three different ciguatera poisonings in Guadeloupe, French West Indies. A mouse bioassay indicated toxicity for each specimen: 0.5-1, greater than or equal to 1 and >1 MU g(-1), respectively. After purification by gel filtration chromatography, the samples were analysed by high-performance liquid chromatography coupled to mass spectrometry (LC-MS). The toxin profiles differed from one fish to another. C-CTX-1 was detected at 0.24, 0.90 and 13.8 ng g(-1) flesh in the snapper, grouper and jack, respectively. It accounted for only part of the whole toxicity determined by the mouse bioassay. Other toxins identified were C-CTX-2 (a C-CTX-1 epimer), three additional isomers of C-CTX-1 or -2, and five ciguatoxin congeners (C-CTX-1127, C-CTX-1143 and its isomer C-CTX-1143a, and C-CTX-1157 and its isomer C-CTX-1157b). Putative hydroxy-polyether-like compounds were also detected in the flesh of the grouper, with [M + H]+ ions at m/z 851.51, 857.50, 875.51, 875.49 and 895.54 Da. Some of these compounds fall in the same mass range as some known dinoflagellate toxins. In conclusion, this study confirms the usefulness of LC-MS analysis for determining ciguatoxin levels and toxin profiles in fish flesh hazardous to humans.
Resumo:
Objective: It has been suggested that parental occupation, particularly farming, increases the risk of Ewing's sarcoma in the offspring. In a national case-control study we examined the relationship between farm-related and other parental occupational exposures and the risk of this cancer in the offspring. Methods: Cases were 106 persons with confirmed Ewing's sarcoma or peripheral primitive neuroectodermal tumor. Population-based controls (344) were selected randomly via telephone. Information was collected by interview (84% face-to-face). Results: We found an excess of case mothers who worked on farms at conception and/or during pregnancy (odds ratio (OR) = 2.3, 95% confidence interval (CI) 0.5-12.0) and a slightly smaller excess of farming fathers; more case mothers usually worked as laborers, machine operators, or drivers (OR = 1.8, 95% CI 0.9-3.9). Risk doubled for those whose mothers handled pesticides and insecticides, or whose fathers handled solvents and glues, or oils and greases. Further, more cases lived on farms (OR = 1.6, 95% CI 0.9-2.8). In the 0-20 years age group, the risk doubled for those who had ever lived on a farm (OR = 2.0, 95% CI 1.0-3.9), and more than tripled for those with farming fathers at conception and/or during pregnancy (OR = 3.5, 95% CI 1.0-11.9). Conclusions: Our data support the general hypothesis of an association between the Ewing's sarcoma family of tumors and farming, particularly among younger cases, who represent the bulk of cases and are more likely to share etiologic factors.
Resumo:
For repairable items sold with free replacement warranty, the actions available to the manufacturer to rectify failures under warranty are to (1) repair the failed item or (2) replace it with a new one. A proper repair-replace strategy can reduce the expected cost of servicing the warranty. In this paper, we study repair-replace strategies for items sold with a two-dimensional free replacement warranty. (C) 2003 Elsevier Ltd. All rights reserved.
Resumo:
Background: Some melanomas form on sun-exposed body sites, whereas others do not. We previously proposed that melanomas at different body sites arise through different pathways that have different associations with melanocytic nevi and solar keratoses. We tested this hypothesis in a case-case comparative study of melanoma patients in Queensland, Australia. Methods: We randomly selected patients from among three prespecified groups reported to the population-based Queensland Cancer Registry: those with superficial spreading or nodular melanomas of the trunk (n = 154, the reference group), those with such melanomas of the head and neck (n = 77, the main comparison group), and those with lentigo maligna melanoma (LMM) (n = 75, the chronic sun-exposed group). Each participant completed a questionnaire, and a research nurse counted melanocytic nevi and solar keratoses. We calculated exposure odds ratios (ORs) and 95% confidence intervals (CIs) to quantify the association between factors of interest and each melanoma group. Results: Patients with head and neck melanomas, compared with patients with melanomas of the trunk, were statistically significantly less likely to have more than 60 nevi (OR = 0.34, 95% CI = 0.15 to 0.79) but were statistically significantly more likely to have more than 20 solar keratoses (OR = 3.61, 95% CI = 1.42 to 9.17) and also tended to have a past history of excised solar skin lesions (OR = 1.87, 95% CI = 0.89 to 3.92). Patients with LMM were also less likely than patients with truncal melanomas to have more than 60 nevi (OR = 0.32, 95% CI = 0.14 to 0.75) and tended toward more solar keratoses (OR = 2.14, 95% CI = 0.88 to 5.16). Conclusions: Prevalences of nevi and solar keratoses differ markedly between patients with head and neck melanomas or LMM and patients with melanomas of the trunk. Cutaneous melanomas may arise through two pathways, one associated with melanocyte proliferation and the other with chronic exposure to sunlight.
Resumo:
The effects of various fallow management systems and cropping intensities on water infiltration were measured on an Alfisol at Ibadan in southwestern Nigeria. The objective was to determine the influence of the land use systems (a combination of crop-fallow sequences and intercropping types) on soil hydraulic properties obtained by disc permeameter and double-ring infiltration measurements. The experiment was established in 1989 as a split-plot design with four replications. The main plots were natural fallow, planted Pueraria phaseoloides and planted Leucaena leucocephala. The subplots were 1 year of maize/cassava intercrop followed by 3-year fallow (25% cropping intensity), 2-year fallow (33% cropping intensity), 1-year fallow (50% cropping intensity), or no fallow period (100% cropping intensity). Water infiltration rates and sorptivities were measured under saturated and unsaturated flow. Irrespective of land use, infiltration rates at the soil surface (121-324 cm h(-1)) were greater than those measured at 30 cm depth (55-144 cm h(-1)), indicating that fewer large pores were present below 30 cm depth than in the 0-30 cm layer. Despite some temporal variation, sorptivities increased as the cropping intensity decreased, with the highest mean value of 93.5 cm h(-1/2), suggesting a more continuous macropore system under less intensive land use systems. This was most likely due to continuous biopores created by perennial vegetation under long fallow systems. Intercropped maize and cassava yields also increased as cropping intensity decreased. The weak relationship between crop yields and hydraulic conductivity/infiltration rates suggests that these rates were not limiting.
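The sorptivity unit quoted above (cm h(-1/2)) reflects the fact that sorptivity is the coefficient of the square-root-of-time term in early-time infiltration models. The abstract does not state which model was fitted; Philip's two-term equation is one common choice consistent with those units, and the Python sketch below evaluates it with illustrative parameter values that are not the study's estimates.

# Philip's two-term infiltration equation: cumulative infiltration
#   I(t) = S * sqrt(t) + A * t
# where S is sorptivity (cm h^-1/2) and A is a near-steady term (cm h^-1).
# The instantaneous rate is its derivative: i(t) = S / (2 * sqrt(t)) + A.
# Parameter values below are illustrative only, not the study's estimates.
import math

S = 93.5   # cm h^-1/2, the highest mean sorptivity quoted in the abstract
A = 60.0   # cm h^-1, hypothetical near-steady infiltration term

def cumulative_infiltration(t_hours: float) -> float:
    return S * math.sqrt(t_hours) + A * t_hours

def infiltration_rate(t_hours: float) -> float:
    return S / (2.0 * math.sqrt(t_hours)) + A

for t in (0.25, 0.5, 1.0):
    print(f"t = {t:4.2f} h: I = {cumulative_infiltration(t):6.1f} cm, i = {infiltration_rate(t):6.1f} cm/h")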
Resumo:
The association between adverse childhood experiences and the onset of depression or chronic pain in adult life has been documented, as has the relationship between chronic pain symptoms and depression. However, few studies have evaluated the role of exposure to adverse childhood experiences in the occurrence of this comorbidity. The aim of this study was to evaluate the influence of exposure to adverse childhood experiences on the occurrence of chronic pain, depression, and comorbid chronic pain and depression in adult life, in a sample of the general adult population (aged 18 years or over) living in the São Paulo metropolitan area, Brazil. The data come from the São Paulo Megacity Mental Health Survey, an epidemiological study of mental disorders. Respondents were assessed with the World Mental Health Survey version of the World Health Organization Composite International Diagnostic Interview (WMH-CIDI), which comprises clinical and non-clinical modules and yields diagnoses according to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV). A total of 5,037 individuals were interviewed, with an overall response rate of 81.3%. Descriptive analyses were performed for means and proportions, and associations (odds ratios, OR) between childhood adversities, chronic pain and depression were estimated by logistic regression. All analyses were carried out in Data Analysis and Statistical Software version 12.0 (STATA 12.0), using two-tailed tests with a 5% significance level. A high prevalence of chronic pain was found (31%, standard error [SE] = 0.8). Chronic pain was associated with anxiety disorders (OR = 2.3; 95% CI = 1.9-3.0), mood disorders (OR = 3.3; 95% CI = 2.6-4.4) and any mental disorder (OR = 2.7; 95% CI = 2.3-3.3). Childhood adversities were strongly associated with comorbid chronic pain and depression, particularly physical abuse (OR = 2.7; 95% CI = 2.1-3.5) and sexual abuse (OR = 7.4; 95% CI = 3.4-16.1).
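A minimal sketch, on simulated data, of how odds ratios such as those above are obtained from a logistic regression. The actual analysis used STATA 12.0 with the survey design of the study, none of which is reproduced here; the variable names below are placeholders and the coefficients are arbitrary.

# Odds ratio and 95% CI for a binary exposure (e.g., a childhood adversity)
# from a logistic regression fitted with statsmodels on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5037                                       # sample size reported in the abstract
adversity = rng.binomial(1, 0.3, n)            # hypothetical exposure indicator
age = rng.normal(40, 12, n)                    # hypothetical covariate
logit = -2.0 + 1.0 * adversity + 0.01 * age    # arbitrary true model
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # hypothetical outcome (e.g., comorbidity)

X = sm.add_constant(np.column_stack([adversity, age]))
fit = sm.Logit(y, X).fit(disp=False)
or_adv = np.exp(fit.params[1])                 # OR for the adversity coefficient
ci_lo, ci_hi = np.exp(fit.conf_int()[1])       # its 95% CI
print(f"OR = {or_adv:.2f}, 95% CI {ci_lo:.2f}-{ci_hi:.2f}")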
Resumo:
Introduction: Interferon-gamma release assays (IGRAs) have emerged as an alternative for diagnosing latent Mycobacterium tuberculosis infection (LTBI). In this study, we compared the performance of one of the IGRAs, the QuantiFERON-TB Gold In-Tube test (QFT), with the tuberculin skin test (TST) at two cut-off points (≥ 5 mm and ≥ 10 mm) in primary health care (PHC) workers. Methods: Cross-sectional study of PHC workers in four Brazilian capital cities with a high incidence of TB. QFT results were compared with TST results at the ≥ 5 mm and ≥ 10 mm cut-offs. Results: A total of 632 health care workers were included. At the ≥ 10 mm cut-off, agreement between QFT and TST was 69% (k = 0.31); at the ≥ 5 mm cut-off, agreement was 57% (k = 0.22). Given the low agreement between TST and QFT, we evaluated factors possibly associated with discordance between them. Comparing the TST-/QFT- group with the TST+/QFT- group at the ≥ 5 mm cut-off, age 41-45 years [OR = 2.70, 95% CI: 1.32-5.51] and 46-64 years [OR = 2.04, 95% CI: 1.05-3.93], presence of a BCG vaccination scar [OR = 2.72, 95% CI: 1.40-5.25] and working exclusively in PHC [OR = 2.30, 95% CI: 1.09-4.86] showed statistically significant associations. At the ≥ 10 mm cut-off, presence of a BCG vaccination scar [OR = 2.26, 95% CI: 1.03-4.91], household contact with a patient with active tuberculosis [OR = 1.72, 95% CI: 1.01-2.92] and a previous TST [OR = 1.66, 95% CI: 1.05-2.62] showed statistically significant associations. Interestingly, the discordance observed in the TST-/QFT+ group was not statistically associated with any of the variables considered, regardless of the TST cut-off. Conclusions: Although we identified BCG vaccination as a contributor to discordance between the tests, the Brazilian recommendations for starting LTBI treatment should not be changed, given the limitations of QFT.
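To make the agreement statistics above concrete, the Python sketch below computes Cohen's kappa from a 2x2 cross-classification of two binary tests. The counts are hypothetical, chosen only to illustrate how roughly 69% raw agreement can still yield a kappa near 0.3; they are not the study's data.

# Cohen's kappa for agreement between two binary tests (e.g., QFT and TST),
# computed from a 2x2 cross-classification. Counts below are hypothetical.
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """a = both positive, b = test1+/test2-, c = test1-/test2+, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_test1_pos, p_test2_pos = (a + b) / n, (a + c) / n
    p_expected = p_test1_pos * p_test2_pos + (1 - p_test1_pos) * (1 - p_test2_pos)
    return (p_observed - p_expected) / (1 - p_expected)

print(f"kappa = {cohens_kappa(110, 60, 135, 327):.2f}")  # hypothetical counts, n = 632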
Resumo:
INTRODUCTION: Early diagnosis and antiretroviral therapy in infants infected with HIV through vertical transmission reduce HIV progression and the comorbidities that can lead to death. GENERAL OBJECTIVE: To evaluate the clinical and epidemiological profile of a cohort of children and adolescents with AIDS, infected through vertical transmission of HIV, over an eleven-year period, followed at a state referral hospital in the State of Espírito Santo, Brazil. SPECIFIC OBJECTIVES: 1. To describe the frequency of comorbidities diagnosed after the HIV diagnosis and their distribution according to demographic, epidemiological and clinical data and to case classification in a cohort of children and adolescents with AIDS. 2. To evaluate predictors of progression to AIDS and death, and causes of death. 3. To estimate the survival rate. METHODS: Retrospective cohort of children and adolescents infected with HIV through vertical transmission, followed at the Specialized Care Service (SAE) of the Hospital Infantil Nossa Senhora da Glória (HINSG) from January 2001 to December 2011, in Vitória, Espírito Santo, Brazil. Data were collected with a standardized study protocol, and data on comorbidities, mortality and underlying cause of death were obtained from medical records, death certificates and the SIM database (Mortality Information System). AIDS and comorbidities were diagnosed according to the CDC (Centers for Disease Control and Prevention) 1994 criteria. RESULTS: A total of 177 patients were enrolled, of whom 97 (55%) were female; 60 (34%) were under 1 year of age, 67 (38%) were 1 to 5 years old and 50 (28%) were 6 years or older at entry into the service. The median age at admission was 30 months (interquartile range (IQR) 25-75%: 5-72 months). Regarding clinical-immunological classification, 146 patients (82.5%) had the moderate/severe form at entry into the service, and 26 (14.7%) died. The most frequent clinical signs were hepatomegaly (81.62%), splenomegaly (63.8%), lymphadenopathy (68.4%) and persistent fever (32.8%). The most frequent comorbidities were anemia (67.2%), first episode of pneumonia/sepsis/meningitis (64.2%), recurrent acute otitis media/sinusitis (55.4%), recurrent severe bacterial infections (47.4%) and dermatitis (43.1%). Severe clinical-immunological classification and entry into the service before one year of age were associated with some comorbidities (p < 0.001). Total follow-up was 11 years, with a median of five years (IQR: 2-8 years). At the end of the study period, 132 (74.6%) patients remained in follow-up, 11 (6.2%) had been transferred to other services and eight (4.5%) were lost to follow-up. Deaths decreased over time. Most patients who died had entered the service with a severe clinical-immunological classification (77%, 20/26), had moderate/severe anemia and had been on antiretroviral therapy (ART) for more than 3 months (71%, 17/24). The main risk factors for death were: age < 1 year (p = 0.005), P. jirovecii pneumonia (p = 0.010), nadir CD4+ T-lymphocyte percentage < 15% (p = 0.012), chronic anemia (p = 0.012), severe clinical-immunological stage (p = 0.003), recurrent severe bacterial infections (p = 0.003) and tuberculosis (p = 0.037).
Having started ART before 6 months of age (early diagnosis and treatment) was associated with survival (OR 2.86 [95% confidence interval (CI): 1.12-7.25], p = 0.027). The main diagnosis recorded for the deaths was severe bacterial infection (57%, 12/21). A high survival rate was found, with an 85.3% probability of surviving beyond 10 years (95% CI 9.6-10.7). CONCLUSIONS: Most children were diagnosed with HIV infection late, increasing the risk of progression to AIDS and death owing to the lack of early treatment. Mortality among HIV-infected children remained fairly constant, declining in the last two years of the study, and bacterial infections persist as the leading cause of death. Therefore, improved prenatal care and pediatric follow-up aimed at early diagnosis of vertically infected children should be part of comprehensive care for children with AIDS, which could reduce the mortality of these children.
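Survival probabilities like the 85.3% 10-year figure above are typically obtained with the Kaplan-Meier product-limit estimator. The Python sketch below implements that estimator on toy follow-up times and event indicators, which are illustrative only and not the cohort's data.

# Kaplan-Meier product-limit estimator. Times are years of follow-up and
# events are death = 1 / censored = 0. The values below are toy data.
def kaplan_meier(times, events):
    """Return (time, survival probability) pairs at each observed event time."""
    data = sorted(zip(times, events))
    at_risk, survival, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        removed = sum(1 for tt, _ in data if tt == t)
        at_risk -= removed
        i += removed
    return curve

toy_times = [0.5, 1, 2, 2, 3, 5, 5, 6, 8, 9, 10, 10, 11, 11]
toy_events = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
for t, s in kaplan_meier(toy_times, toy_events):
    print(f"t = {t:>4} y: S(t) = {s:.3f}")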
Resumo:
INTRODUCTION: The function of the medial olivocochlear tract is studied through suppression of otoacoustic emissions using contralateral stimulation; it is influenced by the laterality of the central nervous system, so responses are not identical between ears. A dysfunction of this tract may be implicated in the generation of tinnitus, but this has not yet been confirmed. OBJECTIVE: To study the suppression of distortion-product otoacoustic emissions in individuals with tinnitus. MATERIAL AND METHOD: Case-control study of 44 people with tinnitus, enrolled at the Tinnitus Clinic of the Otorhinolaryngology Division of the Hospital das Clínicas, Universidade de São Paulo, and 44 volunteers, all submitted to distortion-product otoacoustic emission testing with and without contralateral stimulation. Results from the right ears of the two groups were compared. RESULTS: There was an association between tinnitus and absence of suppression at all frequencies studied (OR > 2.1). CONCLUSIONS: There was a correlation between reduced effectiveness of the medial olivocochlear tract and the presence of tinnitus.
Resumo:
Vegetative propagation of lavender offers several advantages over sexual propagation, among them crop homogeneity and yield of higher-quality essential oil. However, Lavandula species have been propagated mostly by seeds and are said to be recalcitrant to rooting when propagated by cuttings. During cutting propagation, one of the important variables that influence the rooting capacity of cuttings is leaf retention. The objective of this work was to evaluate the influence of leaf retention on rooting of L. dentata cuttings. Apical cuttings of L. dentata, 10 cm in length and retaining approximately 1/3, 1/2 or 2/3 of their leaves, were planted in Plantmax HT® commercial substrate under intermittent mist. After two months, mean root number, length of the longest root, root fresh and dry weights, and survival percentage were evaluated. Root length and fresh weight were statistically greater with 2/3 leaf retention, and root dry weight decreased when fewer leaves were kept on the cuttings. Under the conditions applied in this study, greater leaf retention was better for rooting of L. dentata cuttings.