774 results for: chronic postoperative pain, inguinal herniorrhaphy, groin pain, related factors.


Relevance: 100.00%

Abstract:

Cardiopulmonary bypass (CPB) is a technique used in cardiac surgery and performed thousands of times every day around the world. The haemodynamic instability associated with difficult weaning from CPB is the main cause of mortality in cardiac surgery, and pulmonary hypertension (PH) has been identified as one of the most important risk factors. Recently, it has been hypothesised that prophylactic administration (before CPB) of inhaled milrinone may have a preventive effect and facilitate weaning from CPB in patients with PH. However, this indication and route of administration for milrinone have not yet been approved by regulatory agencies. To date, clinical research on inhaled milrinone has focused mainly on haemodynamic efficacy and safety in cardiac patients, although no biomarker has yet been established. The most appropriate dose for administration by nebulisation has not been determined, nor have the pharmacokinetic (PK) and pharmacodynamic (PD) profiles following inhalation been characterised. The objective of our research was to characterise the exposure-response relationship of inhaled milrinone administered to patients undergoing cardiac surgery with CPB. A high-performance liquid chromatography method with ultraviolet detection (HPLC-UV) was optimised and validated for the assay of plasma milrinone after inhalation and proved to be sensitive and accurate. The lower limit of quantification (LLOQ) was 1.25 ng/mL, with mean intra- and inter-assay precision values (CV%) below 8%. Patients with PH scheduled for cardiac surgery with CPB were recruited first into a pilot study (n=12) and subsequently into a larger study (n=28), in which milrinone (5 mg) was administered by inhalation before CPB. In the pilot study, we compared the systemic exposure to milrinone shortly after its administration with a pneumatic nebuliser or a vibrating-mesh nebuliser. The efficiency of the nebulisers in terms of emitted dose and inhaled dose was also determined in vitro. In the larger study, conducted exclusively with the vibrating-mesh nebuliser, the in vivo inhaled dose was estimated and the pharmacokinetic profile of inhaled milrinone was fully characterised in plasma and urine. The ratio of mean arterial pressure to mean pulmonary artery pressure (MAP/mPAP) was chosen as the PD biomarker. The exposure-response relationship of milrinone was characterised during the inhalation period by studying the relationship between the area under the effect curve (AUEC) and the area under the plasma concentration curve (AUC) for each patient. Finally, the MAP/mPAP ratio was explored as a potential predictor of difficult weaning from CPB in a logistic regression model. The in vitro experiments showed that the emitted doses were similar for the pneumatic (64%) and vibrating-mesh (68%) nebulisers. However, the inhaled dose was 2-3 times higher (46% vs. 17%) with the vibrating-mesh nebuliser, consistent with the plasma concentrations.
In patients, because of variations in circuit- and ventilator-related factors leading to a larger expired dose, the inhaled dose was estimated to be lower (30%), which was confirmed after recovery of the milrinone dose in 24-h urine (26%). Maximum plasma concentrations (Cmax: 41-189 ng/mL) and the magnitude of the maximum response, ΔRmax-R0 (0-65%), were observed at the end of inhalation (10-30 min). The data obtained from the PK analyses are in agreement with published data for intravenous milrinone. After the inhalation period, individual AUECs were directly related to AUCs (P=0.045). Finally, our PD biomarker, together with CPB duration, was identified as a significant predictor of difficult weaning from CPB. The comparison of the AUCs with the corresponding AUECs provided preliminary data supporting a proof of concept for the use of the MAP/mPAP ratio as a promising PD biomarker and justifies future PK/PD studies. We were able to demonstrate that the change in the MAP/mPAP ratio in response to inhaled milrinone contributes to preventing difficult weaning from CPB.
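To make the exposure-response analysis concrete, the following is a minimal sketch, with entirely hypothetical numbers rather than the study's data, of how a plasma AUC and an effect AUEC over the inhalation period can be computed with the linear trapezoidal rule before relating each patient's AUEC to their AUC.

```python
# Hypothetical numbers only; not the study's data or analysis code.
def trapezoid_auc(times, values):
    """Area under the curve by the linear trapezoidal rule."""
    area = 0.0
    for i in range(1, len(times)):
        area += 0.5 * (values[i] + values[i - 1]) * (times[i] - times[i - 1])
    return area

# Illustrative sampling times (min) during a 20-min inhalation period.
t = [0, 5, 10, 15, 20]
plasma_conc = [0.0, 60.0, 110.0, 150.0, 170.0]   # milrinone, ng/mL (made up)
map_mpap = [2.1, 2.3, 2.6, 2.8, 3.0]             # MAP/mPAP ratio (made up)

auc = trapezoid_auc(t, plasma_conc)                            # exposure, ng*min/mL
auec = trapezoid_auc(t, [r - map_mpap[0] for r in map_mpap])   # effect above baseline

print(f"AUC  = {auc:.1f} ng*min/mL")
print(f"AUEC = {auec:.2f} ratio*min")
# Across patients, each individual's AUEC would then be regressed against their AUC.
```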

Relevance: 100.00%

Abstract:

Background Defaulting on scheduled rehabilitation therapy may result in adverse outcomes such as permanent disability and increased healthcare costs. Concomitantly, there is evidence to suggest that early and continued rehabilitation of children with congenital disabilities can improve outcomes significantly. This study was conducted to determine the factors contributing to caregivers defaulting on scheduled rehabilitation therapy sessions. Methods A descriptive cross-sectional study was carried out at Chitungwiza Central Hospital, a tertiary facility offering inpatient and outpatient rehabilitation services in Zimbabwe. Caregivers of children with congenital disabilities (N=40) who had a history of defaulting on treatment and were available during the data collection period responded to an interviewer-administered questionnaire. Data were analysed for means and frequencies using STATA 13. Results Factors that contributed to caregivers defaulting on scheduled therapy included economic constraints (52%), child-related factors (43%), caregiver-related factors (42%), service-centred factors (30%) and psychosocial factors (58%). The majority of the caregivers (98%) were motivated to attend therapy by observable improvements in their children. Other motivators were incentives given in the rehabilitation department (45%), availability of rehabilitation personnel to provide the required services (48%) and psychosocial support from fellow caregivers, families and the rehabilitation staff (68%). Although the caregivers could not distinguish occupational therapy from physiotherapy services, they all reported that therapy was important. Conclusions A combination of psychosocial, economic, child-centred and service-centred factors contributed to caregivers defaulting on scheduled therapy. Interventions that may improve caregiver attendance at scheduled therapy include community outreach services, efficient rehabilitation service provision at the hospitals, and facilitation of income-generating programmes for caregivers.

Relevance: 100.00%

Abstract:

The purpose was to determine the prevalence of, and factors related to, vitamin D (VitD) insufficiency in adolescents and young adults with perinatally acquired human immunodeficiency virus infection. A cohort of 65 patients (17.6 ± 2 years) at the Federal University of Rio de Janeiro, Brazil, was examined for pubertal development, nutrition, serum parathormone and serum 25-hydroxyvitamin D [s25(OH)D]. s25(OH)D levels < 30 ng/mL (< 75 nmol/L) were defined as VitD insufficiency. CD4+ T-cell counts, viral load, history of worst clinical status, immunologic status at nadir, current immunologic status and antiretroviral (ART) regimen were also evaluated as risk factors for VitD insufficiency. Mean s25(OH)D was 37.7 ± 13.9 ng/mL, and 29.2% of patients had VitD insufficiency. There was no difference in VitD status by gender, age, nutritional status, clinical and immunological classification, or type of ART. Only VitD intake showed a tendency towards association with s25(OH)D (p = 0.064). Individuals assessed in the summer/autumn season had higher s25(OH)D than those assessed in winter/spring (42.6 ± 14.9 vs. 34.0 ± 11.9 ng/mL, p = 0.011). Although the frequency of VitD insufficiency did not differ statistically between the groups (summer/autumn 17.9% vs. winter/spring 37.8%, p = 0.102), we suggest monitoring s25(OH)D in seropositive adolescents and young adults, especially during the winter/spring months, even in sunny regions.
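As a small illustration of the cut-off used above, the sketch below converts s25(OH)D from ng/mL to nmol/L (assuming the usual factor of about 2.496 nmol/L per ng/mL) and flags insufficiency at < 30 ng/mL; the example value is the reported cohort mean, everything else is illustrative.

```python
# Unit conversion and cut-off only; the example value is the reported cohort mean.
NG_ML_TO_NMOL_L = 2.496   # ~1000 / 400.6 g/mol for 25-hydroxyvitamin D

def to_nmol_per_l(ng_per_ml: float) -> float:
    return ng_per_ml * NG_ML_TO_NMOL_L

def is_insufficient(ng_per_ml: float, cutoff_ng_ml: float = 30.0) -> bool:
    """Insufficiency as defined in the abstract: s25(OH)D < 30 ng/mL (< ~75 nmol/L)."""
    return ng_per_ml < cutoff_ng_ml

mean_s25ohd = 37.7   # ng/mL, cohort mean reported above
print(f"{mean_s25ohd} ng/mL = {to_nmol_per_l(mean_s25ohd):.1f} nmol/L; "
      f"insufficient: {is_insufficient(mean_s25ohd)}")
```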

Relevance: 100.00%

Abstract:

Background and objective: Participation in colorectal cancer (CRC) screening varies widely among countries and socio-demographic groups. Our objective was to assess the effectiveness of three primary-care interventions to increase CRC screening participation among persons over the age of 50 years and to identify the health- and socio-demographic-related factors associated with greater participation. Methods: We conducted a randomized experimental study with a post-test-only control group. A total of 1,690 subjects were randomly allocated to four groups: written briefing; telephone briefing; an invitation to attend a group meeting; and no briefing. Subjects were evaluated 2 years post-intervention, with the outcome variable being participation in CRC screening. Results: A total of 1,129 subjects were interviewed. The groups were tested for homogeneity in terms of socio-demographic characteristics and health-related variables. The proportion of subjects who participated in screening was 15.4% in the written information group (95% confidence interval [CI]: 11.2-19.7), 28.8% in the telephone information group (95% CI: 23.6-33.9), 8.1% in the face-to-face information group (95% CI: 4.5-11.7) and 5.9% in the control group (95% CI: 2.9-9.0), a statistically significant difference (p < 0.001). Logistic regression showed that only the interventions based on written or telephone briefing were effective. Apart from type of intervention, the number of reported health problems and place of residence remained in the regression model. Conclusions: Both written and telephone information can improve participation in CRC screening. This preventive activity could be optimized by means of simple interventions within the scope of primary health-care professionals.
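The sketch below illustrates, on synthetic individual-level data sized to mimic the reported group proportions rather than the trial's records, a logistic regression of screening participation on intervention group with the control group as the reference category, in the spirit of the analysis described above (assuming pandas and statsmodels are available).

```python
# Synthetic data only; group sizes and counts are chosen to roughly match the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group": ["written"] * 300 + ["telephone"] * 300
             + ["face_to_face"] * 260 + ["control"] * 269,
    "screened": [1] * 46 + [0] * 254      # ~15.4% of 300
               + [1] * 86 + [0] * 214     # ~28.8% of 300
               + [1] * 21 + [0] * 239     # ~8.1% of 260
               + [1] * 16 + [0] * 253,    # ~5.9% of 269
})

# Logistic regression of participation on intervention group, control as reference.
model = smf.logit("screened ~ C(group, Treatment(reference='control'))",
                  data=df).fit(disp=False)
print(model.summary())
print("Odds ratios vs control:\n", np.exp(model.params).round(2))
```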

Relevance: 100.00%

Abstract:

Master's dissertation, Management of Health Units (Gestão de Unidades de Saúde), Faculdade de Economia, Universidade do Algarve, 2016.

Relevance: 100.00%

Abstract:

BACKGROUND Eradication of bovine tuberculosis (bTB) through the application of test-and-cull programs is a declared goal of developed countries in which the disease is still endemic. Here, longitudinal data from more than 1,700 cattle herds tested over a 12-year period in the eradication program of the region of Madrid, Spain, were analyzed to quantify the within-herd transmission coefficient (β) depending on herd type (beef/dairy/bullfighting). In addition, the probability of recovering officially bTB-free (OTF) status in infected herds, depending on the type of herd and the diagnostic strategy implemented, was assessed using Cox proportional hazards models. RESULTS Overall, dairy herds showed a higher β (median 4.7) than beef or bullfighting herds (2.3 and 2.2, respectively). Introduction of the interferon-gamma (IFN-γ) assay as an ancillary test produced an apparent increase in the β coefficient regardless of production type, likely due to an increase in diagnostic sensitivity. Time to recovery of OTF status was also significantly lower in dairy herds, and the length of bTB episodes was significantly reduced when the IFN-γ assay was implemented to manage the outbreak. CONCLUSIONS Our results suggest that bTB spreads more rapidly in dairy herds than in other herd types, likely because of management- and demographic-related factors. However, outbreaks in dairy herds can be controlled more rapidly than in typically extensive herd types. Finally, the IFN-γ assay proved useful for rapidly eradicating bTB at the herd level.
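As a rough illustration of what a within-herd transmission coefficient represents, the sketch below computes a naive β estimate from a single testing interval under a simple frequency-dependent SI assumption; the herd numbers are hypothetical and this is not the estimator used in the paper.

```python
# Naive illustration only; hypothetical herd numbers, not the paper's estimation approach.
def beta_hat(new_cases: int, susceptible: int, infected: int,
             herd_size: int, interval_years: float) -> float:
    """Crude within-herd transmission coefficient (per year) from one testing interval,
    assuming expected new cases = beta * S * I / N * dt."""
    if susceptible <= 0 or infected <= 0 or interval_years <= 0:
        raise ValueError("Need positive S, I and a positive testing interval.")
    return new_cases * herd_size / (susceptible * infected * interval_years)

# Hypothetical dairy herd: 120 animals, 5 reactors at the first test,
# 6 new reactors detected at a follow-up test 6 months later.
print(round(beta_hat(new_cases=6, susceptible=115, infected=5,
                     herd_size=120, interval_years=0.5), 2))   # ~2.5 per year
```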

Relevance: 100.00%

Abstract:

Due to design- and process-related factors, there are local variations in the microstructure and mechanical behaviour of cast components. This work establishes a Digital Image Correlation (DIC) based method for characterising and investigating the effects of such local variations on the behaviour of a high-pressure die-cast (HPDC) aluminium alloy. Plastic behaviour is studied using gradient-solidified samples, and characterisation models for the parameters of the Hollomon equation are developed based on microstructural refinement. Samples with controlled microstructural variations are produced, and the observed DIC strain field is compared with Finite Element Method (FEM) simulation results. The results show that the DIC-based method can be applied to characterise local mechanical behaviour with high accuracy. The microstructural variations are observed to cause a redistribution of strain during tensile loading. This redistribution of strain can be predicted in the FEM simulation by incorporating local mechanical behaviour using the developed characterisation model. A homogeneous FEM simulation is unable to predict the observed behaviour. The results motivate the application of a previously proposed simulation strategy, which is able to predict and incorporate local variations in mechanical behaviour into FEM simulations already at the design stage for cast components.
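For reference, the Hollomon equation relates true stress to true plastic strain as sigma = K * eps_p**n. The sketch below evaluates it with a hypothetical linear dependence of K and n on secondary dendrite arm spacing (SDAS), purely to illustrate how a characterisation model based on microstructural refinement could be used in a local FEM material description; the coefficients are invented, not the ones fitted in the work.

```python
# Illustrative coefficients only; the paper's characterisation model is not reproduced here.
def hollomon_stress(plastic_strain: float, K: float, n: float) -> float:
    """True stress (MPa) from true plastic strain via sigma = K * eps_p**n."""
    return K * plastic_strain ** n

def local_parameters(sdas_um: float) -> tuple[float, float]:
    """Hypothetical linear model: finer microstructure (smaller SDAS) -> higher K, lower n."""
    K = 500.0 - 5.0 * sdas_um     # MPa
    n = 0.20 + 0.002 * sdas_um
    return K, n

for sdas in (10.0, 20.0, 30.0):   # fine to coarse regions of a casting (um)
    K, n = local_parameters(sdas)
    sigma = hollomon_stress(0.02, K, n)   # flow stress at 2% plastic strain
    print(f"SDAS {sdas:4.1f} um: K = {K:5.1f} MPa, n = {n:.3f}, sigma(0.02) = {sigma:5.1f} MPa")
```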

Relevance: 100.00%

Abstract:

Introduction: Having a diagnosis of occupational health and safety conditions in the country makes it possible to create strategies to minimise the problems of the working population. In Colombia there is the observatory of the Instituto Nacional de Salud; however, none of its topics includes information and analysis on the health and safety of the working population. Objective: To determine the health conditions of the population attended at the IPS SALUD OCUPACIONAL DE LOS ANDES LDTA in the city of Bogotá during 2015. Materials and methods: A pilot test of the occupational health and safety observatory was carried out by means of a cross-sectional study, using a database of patients evaluated at the IPS SALUD OCUPACIONAL DE LOS ANDES LDTA in Bogotá D.C., containing information on occupational medical examinations performed in 2015 on the ISISMAWEB platform, with a representative sample of 1,923 records. The variables included were sociodemographic and occupational characteristics, the most prevalent laboratory tests recorded as abnormal, the diagnoses and assessments issued for the study population, and the personal recommendations given by the company's management system. A descriptive analysis was performed, and the chi-squared test was used to study associations. Results: 62.1% of the population were men, with a mean age of 34.8 years (SD 10.5). 41.5% had secondary education. The most frequently performed medical evaluation was the pre-employment examination (30.5% of cases). The occupational category of plant and machine operators and assemblers accounted for 27.9%, while mid-level professionals in financial and administrative operations accounted for the smallest share, 0.5%. The most frequently issued ICD-10 diagnosis was code Z100 (occupational health examination) at 15.8%, followed by unspecified disorder of refraction (H527) at 9.0%. Among the general recommendations, the most frequent was a periodic examination (30%). The most frequent preventive recommendation was musculoskeletal (36.5%). The most prevalent health-surveillance (SVE) recommendations were ergonomic (40.7%). Associations (p<0.05) were found between the variables schooling, gender and socioeconomic stratum. Conclusions: Data collection mechanisms should be optimised to make their evaluation and association analysis more feasible. There is significant under-reporting of second diagnoses, associated with the failure to record laboratory tests. This study proposes a model to follow for developing the national occupational health and safety observatory.
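As a small illustration of the association analysis mentioned above, the sketch below runs a chi-squared test of independence on a synthetic contingency table (for example, schooling level versus socioeconomic stratum); the counts are invented, not the observatory's data.

```python
# Synthetic counts only; not the observatory's data.
from scipy.stats import chi2_contingency

# Rows: schooling (primary, secondary, higher); columns: socioeconomic stratum (low, mid, high).
table = [
    [220, 130, 30],
    [310, 390, 98],
    [60, 180, 105],
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.4g}")
if p_value < 0.05:
    print("Evidence of association between schooling and stratum at the 5% level.")
```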

Relevance: 100.00%

Abstract:

Cysticercosis results from the ingestion of Taenia solium eggs, either directly via the faecal-oral route or through contaminated food or water. These eggs are released by human tapeworm carriers, who become infected after eating pork contaminated with cysticerci. Cysticercosis occurs when tapeworm eggs are ingested by an intermediate host (pig or human) and then hatch, migrate and lodge in the host's tissues, where they develop into larval cysticerci. When they lodge in the central nervous system of humans, the result is the disease condition called neurocysticercosis (NCC), whose manifestations are heterogeneous, depending on the location, number, size and developmental stage of the cysts (1). Consequently, the prognosis ranges from asymptomatic infection to situations leading to death in 2% to 9.8% of cases (7). In swine there are few studies, but recent work has shown that animals, for the same reasons, also develop neurological abnormalities, expressed as seizures, stereotypic walking in circles, and chewing motions with foamy salivation, including tonic muscle contractions followed by a sudden loss of muscle tone leading to collapse (2). Conventional domestic wastewater treatment processes may not be fully effective in inactivating Taenia solium eggs, allowing some contamination of soils and agricultural products (11). In Portugal there is some evidence of aggregation of human cysticercosis cases in specific regions, based on ecological-design studies (6). There is little information about human tapeworm carriers and the social and economic factors associated with them. Progress in knowledge, and consequently in lowering transmission, is limited by the complex network of biological and social factors that sustain the spread. Effective control of most zoonoses requires a One Health approach, built on real knowledge and transparency in the information provided by the institutions responsible for both animal and human health, allowing sustained interventions targeted at the crucial nodes of the transmission cycle. In general, the control model used reflects a rural reality in which pigs are raised freely, sanitation is poor and sanitary inspection is incipient. In cysticercosis, pigs are obligate intermediate hosts and are therefore considered the first targets for control and used as sentinels to monitor environmental T. solium contamination (3). Environmental contamination with Taenia spp. eggs is usually a key issue, with landscape factors influencing the presence of Taenia spp. antigens in both pigs and humans (5). Soil-related factors, as well as socio-economic and behavioural factors, are associated with the emergence of significant clustering of human cysticercosis (4,5). However, few studies have been carried out in urban environments and in developed countries with the aim of characterising the spatial pattern, and there are still few data available regarding its prevalence and spatial distribution. Transmission patterns are likely to correlate with housing conditions, water supply, basic sanitation, schooling and the birthplace of the individual or relatives, more than with free-range pig rearing or soil conditions (9). As a matter of fact, tapeworm carriers from endemic zones can auto-infect or transmit the infection to other people, or arrive in a cysticercosis-free country already suffering from NCC (as a result of travelling to, or being a citizen of, an endemic country).
Transmission is faecal-oral; this includes transmission through person-to-person contact, through autoinfection, or through contaminated food. This has been happening in different continents, such as North America (where 5.4-18% of cases are autochthonous), Europe and Australia (7). Recently, case reports of NCC have also emerged from Muslim countries (10). Indeed, several papers describe an epidemic situation in Spain and Portugal (7, 8). However, the type of study performed does not support such a conclusion: there is no evidence that the infections were acquired in Portugal, and the mode of transmission has not been characterised. Papers carrying this kind of information can have economic consequences, resulting in artificial trade barriers with serious repercussions for pig producers and the pig meat trade. We need transparency in the information that provides the basis for developing and targeting future effective control programmes (and proof that we need them). So, to obtain a real picture of the disease, it is necessary to integrate data from humans, animals and the environmental factors surrounding human and pig cases in order to characterise the pattern of transmission. The study design needs to be able to capture unexpected and uncommon outcomes, beyond routine data. We need to think "One Health" to get a genuine picture of the situation.