22 results for Total Parenteral Nutrition
in Helda - Digital Repository of University of Helsinki
Abstract:
Helicobacter pylori (H. pylori) infection is a major cause of chronic gastritis and peptic ulcer disease, and it is also designated a class I carcinogen for stomach cancer. The role of probiotics in the treatment of gastrointestinal infections is increasingly documented as an alternative or complement to antibiotics, with the potential to decrease the use of antibiotics or reduce their adverse effects. These studies were conducted to investigate the role of probiotics in the treatment of H. pylori infection. The various aspects included: an investigation of the effects of a probiotic combination consisting of Lactobacillus rhamnosus GG, L. rhamnosus LC705, Propionibacterium freudenreichii ssp. shermanii JS and Bifidobacterium breve Bb99 or B. lactis Bb12 as a supplement to H. pylori eradication therapy, with special reference to tolerability, effectiveness and microbiota alterations following the treatment; and an exploration of the role of probiotics in vivo in H. pylori-infected and uninfected patients, as well as in an in vitro model of H. pylori infection. The probiotic combination therapy significantly reduced the total symptom score, which takes into account both the frequency and the severity of the adverse effects, during the eradication treatment. The supplementation did not significantly improve the success of the eradication treatment, although some difference was seen in the eradication percentages (91% vs. 79%). The quantities of the predominant bacterial groups were altered significantly following the triple treatment. Probiotics slightly counteracted the effects of anti-H. pylori treatment, seen as significantly fewer alterations in the total numbers of aerobes and lactobacilli/enterococci group bacteria. After the probiotic intervention, L. rhamnosus GG adhered to the upper gastrointestinal mucosa in only a minority of the patients, but all of the probiotics survived gastrointestinal transit well, with and without antimicrobial treatment.
Probiotic intervention decreased gastrin-17 levels in H. pylori-infected patients and appeared to decrease 13C-urea breath test values. In in vitro Caco-2 cell line experiments, probiotics inhibited H. pylori adhesion to intestinal epithelial cells. Both L. rhamnosus strains, P. freudenreichii ssp. shermanii JS and the combination inhibited H. pylori-induced acute cell leakage. Simultaneously, both L. rhamnosus strains and the combination transiently improved epithelial barrier function. The pro-inflammatory effects prevailed when the probiotics were used in combination. According to this series of studies, the probiotic combination could have some potential in reducing the adverse effects induced by H. pylori eradication treatment and beneficial effects in H. pylori-infected subjects.
Abstract:
In Finland, suckler cow production is carried out in circumstances characterized by a long winter period and a short grazing period. The traditional winter housing for suckler cows has been insulated or uninsulated buildings, but there is a demand for less expensive housing systems. In addition, more information is needed on new winter feeding strategies, carried out in inexpensive winter facilities with conventional (hay, grass silage, straw) or alternative (treated straw, industrial by-products, whole-crop silage) feeds. The new feeding techniques should have no detrimental effects on animal welfare in order to be acceptable to both farmers and consumers. Furthermore, no official feeding recommendations for suckler cows are available in Finland and, thus, recommendations for dairy cows have been used. However, this may lead to over- or underfeeding of suckler cows and, ultimately, to decreased economic output. In Experiment I, second-calf beef-dairy suckler cows were used to compare the effects of diets based on hay (H) or urea-treated straw (US) at two feeding levels (moderate, M, vs. low, L) on the performance of cows and calves. Live weight (LW) gain during indoor feeding was lower for cows on level L than on level M. Cows on diet US lost more LW indoors than those on diet H. The cows replenished the LW losses on good pasture. Calf LW gain and cow milk production were unaffected by the treatments. The conception rate was unaffected by the treatments but was only 69%. Urea-treated straw proved to be a suitable winter feed for spring-calving suckler cows. Experiment II studied the effects of feeding accuracy on the performance of first- and second-calf beef-dairy cows and calves. In II-1, the day-to-day variation in the roughage offered ranged up to ± 40%. In II-2, the same variation was applied over two-week periods. Variation in the roughages offered had only minor effects on cow performance. Reproduction was unaffected by feeding accuracy.
Accurate feeding is not necessary for young beef-dairy crosses if the total amount of energy offered over a period of a few weeks fulfils the energy requirements. The effects of feeding strategies with alternative feeds on the performance of mature beef-dairy and beef cows and calves were evaluated in Experiment III. Two studies compared two feeding strategies (step-up vs. flat-rate) and two diets (control vs. alternative). There were no differences between treatments in cow LW, body condition score (BCS), calf pre-weaning LW gain or cow reproduction. A flat-rate strategy can be practised in the nutrition of mature suckler cows. An oat hull-based flour-mill by-product can partly replace grass silage and straw in the winter diet. Whole-crop barley silage can be offered as a sole feed to suckler cows. Experiment IV evaluated the effects of replacing grass silage with whole-crop barley or oat silage on mature beef cow and calf performance during the winter feeding period. Both whole-crop silages were suitable winter feeds for suckler cows in cold outdoor winter conditions. Experiment V assessed the effects of daily feeding vs. feeding every third day on the performance of mature beef cows and calves. No differences between the treatments were observed in cow LW, BCS, milk production or calf LW. The serum concentrations of urea and long-chain fatty acids were increased on the third day after feeding in the cows fed every third day. Despite this, feeding every third day is an acceptable feeding strategy for mature suckler cows. Experiment VI studied the effects of feeding level and long-term cold climatic conditions on mature beef cows and calves. The cows were overwintered in outdoor facilities or in an uninsulated indoor facility. Whole-crop barley silage was offered either ad libitum or restricted. All the facilities offered adequate shelter for the cows. The restricted offering of whole-crop barley silage provided enough energy for the cows.
The Finnish energy recommendations for dairy cows were too high for mature beef-breed suckler cows in good body condition at housing, even in cold conditions. There is therefore a need to determine feeding recommendations for suckler cows in Finland. The results showed that the required amount of energy can be offered to the cows using conventional or alternative feeds provided at a lower feeding level, with an inaccurate-feeding, flat-rate feeding or feeding-every-third-day strategy. The cows must have an opportunity to replenish the LW and BCS losses at pasture before the next winter. Production in cold conditions can be practised in inexpensive facilities when shelter against rain and wind, a dry resting place, adequate amounts of feed suitable for cold conditions and water are provided for the animals, as was done in the present study.
Abstract:
Background: Malnutrition is a common problem for residents of nursing homes and long-term care hospitals. It has a negative influence on elderly residents' and patients' health and quality of life. Nutritional care appears to have a positive effect on elderly individuals' nutritional status and well-being. Studies of Finnish elderly people's nutrition and nutritional care in institutions are scarce. Objectives: The primary aim was to investigate the nutritional status of elderly nursing home residents and long-term care patients in Finland and its associated factors; in particular, to find out whether nursing or nutritional care factors are associated with nutritional status, and how well carers and nurses recognize malnutrition. A further aim was to assess the energy and nutrient intake of the residents of dementia wards. A final objective was to find out whether nutrition training of professionals leads to changes in their knowledge and further translates into better nutrition for aged residents of dementia wards. Subjects and methods: The residents' (n=2114) and patients' (n=1043) nutritional status was assessed in all studies using the Mini Nutritional Assessment test (MNA). Information on the residents' and patients' daily routines and the nutritional care provided was gathered with a questionnaire. The energy and nutrient intakes of residents (n=23; n=21) in dementia wards were determined over three days by the precise weighing method. Constructive learning theory was the basis for educating the professionals (n=28). A semi-structured questionnaire was used to assess the professionals' learning. Studies I-IV were cross-sectional studies, whereas Study V was an intervention study. Results: Malnutrition was common among elderly residents and patients living in nursing homes and hospitals in Finland. According to the MNA, 11% to 57% of the studied elderly people suffered from malnutrition and 40-89% were at risk of malnutrition, whereas only 0-16% had a good nutritional status.
Resident- and patient-related factors such as dementia, impaired ADL (Activities of Daily Living), swallowing difficulties and constipation mainly explained the malnutrition, but some nutritional care-related factors, such as eating less than half of the offered food portion and not receiving snacks, were also related to malnutrition. The intake of energy and some nutrients by the residents of dementia wards was lower than recommended, although the offered food contained enough energy and nutrients. The proportion of residents receiving vitamin D supplementation was low, despite the existing recommendation and the known benefits of an adequate vitamin D intake. Nurses recognized malnutrition poorly, identifying only one in four (26.7%) of the actual cases. Keeping and analysing food diaries and reflecting on nutritional issues in small-group discussions were effective training methods for professionals. The nutrition education of professionals had a positive impact on the energy and protein intake, BMIs and MNA scores of some residents in dementia wards. Conclusions: Malnutrition was common among elderly residents and patients living in nursing homes and hospitals in Finland. Although resident- and patient-related factors mainly explained malnutrition, nurses recognized malnutrition poorly and the possibilities of nutritional care were underused. The professionals' nutrition education had a positive impact on the nutrition of elderly residents. Further studies describing successful nutritional care and the nutrition education of professionals are needed.
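The MNA categories reported above follow the instrument's standard cut-offs. As an illustrative sketch (the thresholds below are the standard published MNA cut-offs, not figures taken from these studies), the classification can be written as:

```python
def mna_category(score: float) -> str:
    """Classify a Mini Nutritional Assessment (MNA) total score (0-30).

    Standard published cut-offs (not specific to these studies):
      < 17        -> malnourished
      17 to 23.5  -> at risk of malnutrition
      24 to 30    -> normal nutritional status
    """
    if not 0 <= score <= 30:
        raise ValueError("MNA total score must lie between 0 and 30")
    if score < 17:
        return "malnourished"
    if score <= 23.5:
        return "at risk of malnutrition"
    return "normal nutritional status"
```

Applied to the figures above, a resident scoring, say, 16 would fall into the malnourished group that comprised 11% to 57% of the studied populations.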
Abstract:
This study analyzes the wartime rations the Finnish soldiers received at the front from 1939 until 1945. The main objective was to determine the contents of the rations and how they affected the soldiers' nutrition and morale. The information concerning food and feeding is mainly based on official documents in the Military Archives. Additional material came from the historical literature, from memoirs and from veterans who personally experienced the front. The documents in the Archives of Military Medicine provided information on the soldiers' nutritional deficiencies. During the Winter War, which lasted from 30 November 1939 until 13 March 1940, ample food was available. The cold climate caused problems, as fresh food froze. However, no severe deficiency cases were reported and morale was high. By contrast, during the Continuation War, which began in June 1941 and ended in September 1944, difficulties were experienced. At that time farming in the country faced serious problems owing to shortages of labour, fuel and other inputs, and importing food was generally difficult. Food imports, mainly from Germany, nevertheless saved the Finns from hunger. In addition, the soldiers' own food procurement at the front added somewhat to the food supply. Even so, the rations had to be reduced. Their energy values were consequently low, especially for the young men. The food was monotonous and occasionally caused complaints. The main sources of protein, vitamins and minerals were whole-cereal foods. Butter was fortified with vitamin A, and vitamin C tablets were also distributed to compensate for the scant food sources. Only approximately 300 serious deficiency cases required hospital care during the three years, out of a total of 400 000 soldiers. Feeding the young soldiers during the war in Lapland (1944-1945), a region that had been destroyed, was problematic, but the increased rations saved them from deficiencies as well.
In spite of the severe difficulties occasionally experienced in feeding the soldiers during the wars, the system worked throughout. The soldiers were fed, cases of nutritional deficiency and food-borne epidemics were kept very limited, and the morale of the soldiers remained high.
Abstract:
Nutrition affects bone health throughout life. To optimize peak bone mass development and maintenance, it is important to pay attention to the dietary factors that enhance and impair bone metabolism. In this study, the in vivo effects of inorganic dietary phosphate and the in vitro effects of the bioactive tripeptides IPP, VPP and LKP were investigated. Dietary phosphate intake is increased through the use of convenience foods and soft drinks rich in phosphate-containing food additives. Our results show that increased dietary phosphate intake hinders mineral deposition in cortical bone and diminishes bone mineral density (BMD) in the aged skeleton in a rodent model (Study I). In the growing skeleton (Study II), increased phosphate intake reduced bone material and structural properties, leading to diminished bone strength. Studies I and II revealed that a low Ca:P ratio has negative effects on the mature and growing rat skeleton even when calcium intake is sufficient. High dietary protein intake is beneficial for bone health. Protein is essential for bone turnover and matrix formation. In addition, hydrolysis of proteins in the gastrointestinal tract produces short peptides that possess a biological function beyond that of being tissue building blocks. The effects of three bioactive tripeptides, IPP, VPP and LKP, were assessed in short- and long-term in vitro experiments. Short-term treatment (24 h) with tripeptide IPP, VPP or LKP influenced osteoblast gene expression (Study III). IPP, in particular, regulated genes associated with cell differentiation, cell growth and cell signal transduction. The upregulation of these genes indicates that IPP enhances osteoblast proliferation and differentiation. Long-term treatment with IPP enhanced osteoblast gene expression in favour of bone formation and increased mineralization (Study IV).
The in vivo effects of IPP on osteoblast differentiation might differ, since eating frequency drives food consumption and protein degradation products, such as bioactive peptides, are available periodically, not continuously as in this study. To sum up, Studies I and II raise concern about the appropriate amount of dietary phosphate to support bone health, as an excess is harmful. Studies III and IV, in turn, support findings of the beneficial effects of dietary protein on bone and provide a mechanistic explanation, since cell proliferation and osteoblast function were improved by treatment with the bioactive tripeptide IPP.
Abstract:
Cardiovascular diseases (CVDs) are the leading cause of mortality in the world. Studies of the impact of single nutrients on the risk for CVD have often provided inconclusive results, and recent research in nutritional epidemiology with a more holistic whole-diet approach has proven fruitful. Moreover, dietary habits in childhood and adolescence may play a role in later health and disease, either independently or by tracking into adulthood. The main aims of this study were to find childhood and adulthood determinants of adulthood diet, to identify dietary patterns present among the study population and to study the associations between long-term food choices and cardiovascular health in young Finnish adults. The study is a part of the multidisciplinary Cardiovascular Risk in Young Finns study, which is an ongoing, prospective cohort study with a 21-year follow-up. At baseline in 1980, the subjects were children and adolescents aged 3 to 18 years (n included in this study = 1768), and young adults aged 24 to 39 years at the latest follow-up study in 2001 (n = 1037). Food consumption and nutrient intakes were assessed with repeated 48-hour dietary recalls. Other determinations have included comprehensive risk factor assessments using blood tests, physical measurements and questionnaires. In the latest follow-up, ultrasound examinations were performed to study early atherosclerotic vascular changes. The average intakes showed substantial changes since 1980. Intakes of fat and saturated fat had decreased, whereas the consumption of fruits and vegetables had increased. Intake of fat and consumption of vegetables in childhood and physical activity in adulthood were important health behavioural determinants of adult diet. Additionally, a principal component analysis was conducted to identify major dietary patterns at each study point. A similar set of two major patterns was recognised throughout the study. 
The traditional dietary pattern correlated positively with the consumption of traditional Finnish foods, such as rye, potatoes, milk, butter, sausages and coffee, and negatively with fruit, berries and dairy products other than milk. This type of diet was independently associated with several risk factors of CVD, such as total and low-density lipoprotein cholesterol, apolipoprotein B and C-reactive protein concentrations in both genders, as well as with systolic blood pressure and insulin levels among women. The traditional pattern was also independently associated with intima-media thickness (IMT), a subclinical predictor of CVD, in men but not in women. The health-conscious pattern, predominant among female subjects, non-smokers and urbanites, was characterised by more health-conscious food choices such as vegetables, legumes and nuts, tea, rye, fish, cheese and other dairy products, as well as by the consumption of alcoholic beverages. This pattern was inversely, but less strongly, associated with cardiovascular risk factors. Tracking of the dietary pattern scores was observed, particularly among subjects who were adolescents at baseline. Moreover, a long-term high intake of protein concurrent with a low intake of fat was positively associated with IMT. These findings suggest that food behaviour and food choices are to some extent established as early as childhood or adolescence and may significantly track into adulthood. Long-term adherence to traditional food choices seems to increase the risk of developing CVD, especially among men. Those with intentionally or unintentionally low-fat diets but a high intake of protein may also be at increased risk of CVD. The findings offer practical, food-based information on the relationship between diet and CVD and encourage further use of the whole-diet approach in epidemiological research. The results support earlier findings that long-term food choices play a role in the development of CVD.
The apparent influence of childhood habits is important to bear in mind when planning educational strategies for the primary prevention of CVD. Further studies on food choices over the entire lifespan are needed.
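The dietary patterns described above were derived by principal component analysis of food-consumption data. As a rough, self-contained illustration of the idea (a pure-Python power-iteration sketch on made-up data, not the study's actual analysis, which covered many food groups and used standard statistical software):

```python
import math
import random

def first_principal_component(rows, iters=200, seed=0):
    """Return the loading vector of the first principal component of
    `rows` (one list of food-consumption values per subject), found by
    power iteration on the sample covariance matrix."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    centred = [[r[j] - means[j] for j in range(p)] for r in rows]
    cov = [[sum(c[a] * c[b] for c in centred) / (n - 1)
            for b in range(p)] for a in range(p)]
    rng = random.Random(seed)
    v = [rng.random() + 0.1 for _ in range(p)]  # non-zero start vector
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Made-up toy data: "rye" and "potatoes" vary together, "fruit" varies
# in the opposite direction -- a caricature of the traditional pattern.
subjects = [[2, 2, 0], [3, 3, 1], [0, 0, 2], [1, 1, 3]]
loadings = first_principal_component(subjects)
```

On these toy data the first component loads rye and potatoes with one sign and fruit with the opposite sign, which is how a "traditional" pattern shows up; each subject's pattern score is then the centred consumption values projected onto this loading vector.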
Abstract:
The low solubility of iron (Fe) depresses plant growth in calcareous soils. In order to improve Fe availability, calcareous soils are treated with synthetic ligands, such as ethylenediaminetetraacetic acid (EDTA) and ethylenediiminobis(2-hydroxyphenyl)acetic acid (EDDHA). However, high expenses may hinder their use (EDDHA), and the recalcitrance of EDTA against biodegradation may increase the potential of cadmium (Cd) and lead (Pb) leaching. This study evaluated the ability of biodegradable ligands, i.e. different stereoisomers of ethylenediaminedisuccinic acid (EDDS), to provide Fe for lettuce (Lactuca sativa L.) and ryegrass (Lolium perenne cv. Prego), their effects on the uptake of other elements and solubility in soils, and their subsequent effects on the activity of oxygen-scavenging enzymes in lettuce. Both EDTA and EDDHA were used as reference ligands. In unlimed and limed quartz sand, both FeEDDS(S,S) and a mixture of stereoisomers of FeEDDS (25% [S,S]-EDDS, 25% [R,R]-EDDS and 50% [S,R]/[R,S]-EDDS), FeEDDS(mix), were as efficient as FeEDTA and FeEDDHA in providing lettuce with Fe. However, in calcareous soils only FeEDDS(mix) was comparable to FeEDDHA when Fe was applied twice a week to mimic drip irrigation. Fe deficiency increased the manganese (Mn) concentration in lettuce in both acidic and alkaline growth media, whereas Fe chelates depressed it. The same was observed with zinc (Zn) and copper (Cu) in acidic growth media. EDDHA probably affected the hormonal status of lettuce as well and thus depressed the uptake of Zn and Mn even more. The nutrient concentrations of ryegrass were only slightly affected by Fe availability. After Fe chelate splitting in calcareous soils, EDDS and EDTA increased the solubility of Zn and Cu most, but only the Zn concentration was increased in lettuce. The availability of Fe increased the activity of oxygen-scavenging enzymes (ascorbate peroxidase, guaiacol peroxidase, catalase).
The activity of Cu/ZnSOD (Cu/Zn superoxide dismutase) and MnSOD in lettuce leaves followed the concentrations of Zn and Mn. In acidic quartz sand, low availability of Fe increased the cobalt (Co) and nickel (Ni) concentrations in lettuce, but Fe chelates decreased them. EDTA increased the solubility of Cd and Pb in calcareous soils, but not their uptake. The biodegradation of EDDS was not affected by the complexed element, and [S,S]-EDDS was biodegraded within 28 days in calcareous soils. EDDS(mix) was more recalcitrant, and after 56 days of incubation the water-soluble elements (Fe, Mn, Zn, Cu, Co, Ni, Cd and Pb) corresponded to 10% of the added EDDS(mix) concentration.
Abstract:
Heredity explains a major part of the variation in calcium homeostasis and bone strength, and susceptibility to osteoporosis is polygenically regulated. Bone phenotype results from the interplay between lifestyle and genes, and several nutritional factors modulate bone health throughout life. Thus, nutrigenetics, which examines genetic variation in nutrient intake and homeostatic control, is an important research area in the etiology of osteoporosis. Despite continuing progress in the search for candidate genes for osteoporosis, the results thus far have been inconclusive. The main objective of this thesis was to investigate the associations of lactase, vitamin D receptor (VDR), calcium-sensing receptor (CaSR) and parathyroid hormone (PTH) gene polymorphisms and lifestyle factors, and their interactions, with bone health in Finns at varying stages of the skeletal life span. Markers of calcium homeostasis and bone remodelling were measured from blood and urine samples. Bone strength was measured at peripheral and central bone sites. Lifestyle factors were assessed with questionnaires and interviews. Genetic lactase non-persistence (the C/C-13910 genotype) was associated with lower consumption of milk from childhood onwards, predisposing females in particular to inadequate calcium intake. Consumption of low-lactose milk and milk products was shown to decrease the risk of inadequate calcium intake. In young adulthood, bone loss was more common in males than in females. Males with the lactase C/C-13910 genotype may be more susceptible to bone loss than males with the other lactase genotypes, although calcium intake predicts changes in bone mass better than the lactase genotype. The BsmI and FokI polymorphisms of the VDR gene were associated with bone mass in growing adolescents, but the associations weakened with age.
In young adults, the A986S polymorphism of the calcium-sensing receptor gene was associated with serum ionized calcium concentrations, and the BstBI polymorphism of the parathyroid hormone gene was related to bone strength. The FokI polymorphism and sodium intake showed an interaction effect on urinary calcium excretion. A novel gene-gene interaction between the VDR FokI and PTH BstBI polymorphisms was found in the regulation of PTH secretion and urinary calcium excretion. Further research should be carried out with larger numbers of Finns at varying stages of the skeletal life span and with more detailed measurements of bone strength. Such research should address the mechanisms by which genetic variants affect calcium homeostasis and bone strength, and the role of diet-gene and gene-gene interactions in the pathogenesis of osteoporosis.
Abstract:
In Finland, peat harvesting sites are utilized down almost to the mineral soil. In this situation the properties of the mineral subsoil are likely to have considerable influence on a site's suitability for the various after-use forms. The aims of this study were to identify the chemical and physical properties of mineral subsoils that may limit the after-use of cut-over peatlands, to define a minimum practice for mineral subsoil studies and to describe the role of different geological areas. The future percentages of the different after-use forms were predicted, which also made it possible to predict carbon accumulation in this future situation. The mineral subsoils of 54 different peat production areas were studied. Their general features and grain size distributions were analysed. Other properties studied were pH, electrical conductivity, organic matter, water-soluble nutrients (P, NO3-N, NH4-N, S and Fe) and exchangeable nutrients (Ca, Mg and K). In some cases other elements were analysed as well. In an additional case study, carbon accumulation effectiveness before the intervention was evaluated on three sites in the Oulu area (representing sites typically considered for peat production). Areas with relatively sulphur-rich mineral subsoil and pool-forming areas with very fine and compact mineral subsoil together covered approximately one fifth of all areas. These areas were unsuitable for commercial use and were recommended, for example, for mire regeneration. Another approximately one fifth of the areas comprised very coarse or very fine sediments. Commercial use of these areas would demand special techniques, such as using the remaining peat layer to compensate for properties missing from the mineral subsoil. One after-use form was seldom suitable for a whole released peat production area. Three typical distribution patterns (models) of different mineral subsoils within individual peatlands were found. Of the studied cut-over peatlands, 57% were well suited for forestry.
In a conservative calculation, 26% of the areas were clearly suitable for agriculture, horticulture or energy crop production. If till without large boulders were included, the percentage of areas suitable for field crop production would be 42%. Some 9-14% of all areas were well suited for mire regeneration or bird sanctuaries, but all areas were considered possible for mire regeneration with the correct techniques. Another 11% was recommended for mire regeneration to avoid disturbing the mineral subsoil, so in total 20-25% of the areas would be used for rewetting. High sulphur concentrations and acidity were typical of the areas below the highest shoreline of the ancient Litorina Sea and of the Lake Ladoga-Bothnian Bay zone. Differences related to nutrient status were also detected. In coarse sediments, the natural nutrient concentrations were clearly higher in the Lake Ladoga-Bothnian Bay zone and in the areas of Svecokarelian schists and gneisses than in the granitoid area of central Finland and in the Archaean gneiss areas. Based on this study, the recommended minimum analysis for after-use planning covers pH, sulphur content and the percentage of fine material (<0.06 mm). Nutrient capacity could be analysed using the natural concentrations of calcium, magnesium and potassium. Carbon accumulation scenarios were developed based on the land-use predictions. These scenarios were calculated for the areas in peat production and the areas released from peat production (59 300 ha + 15 671 ha). The carbon accumulation of the scenarios varied between 0.074 and 0.152 million t C a-1. In the three peatlands considered for peat production, the long-term carbon accumulation rates varied between 13 and 24 g C m-2 a-1. The natural annual carbon accumulation had been decreasing towards the time of possible intervention.
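The two units quoted above (g C m-2 a-1 per site vs. million t C a-1 over the whole area) are linked by a simple unit conversion. A minimal sketch, where the function name and the 20 g C m-2 a-1 example rate are illustrative and not figures from the study:

```python
def total_accumulation_mt_c_per_year(area_ha: float, rate_g_c_m2_a: float) -> float:
    """Convert a per-area carbon accumulation rate (g C m-2 a-1) over an
    area in hectares to a total in million tonnes of carbon per year.
    1 ha = 10 000 m2; 1 million t = 1e12 g."""
    grams_per_year = area_ha * 10_000 * rate_g_c_m2_a
    return grams_per_year / 1e12

# The combined area from the study (59 300 ha + 15 671 ha) at an
# illustrative mid-range natural mire rate of 20 g C m-2 a-1:
total = total_accumulation_mt_c_per_year(59_300 + 15_671, 20)
```

At such natural mire rates the total would be on the order of 0.015 million t C a-1; the higher scenario figures (0.074-0.152 million t C a-1) presumably reflect faster accumulation under after-use forms such as forestry and crop production.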
Abstract:
The first aim of the current study was to evaluate the survival of total hip arthroplasty (THA) in patients aged 55 years and older at a nation-wide level. The second aim was to evaluate, on a nation-wide basis, the geographical variation in the incidence of primary THA for primary osteoarthritis (OA) and to identify variables possibly associated with this variation. The third aim was to evaluate the effects of hospital volume on the length of stay, the number of re-admissions and the number of complications of THA at the population level in Finland. The survival of implants was analysed based on data from the Finnish Arthroplasty Register. The incidence and hospital volume data were obtained from the Hospital Discharge Register. Cementless total hip replacements had a significantly reduced risk of revision for aseptic loosening compared with cemented hip replacements. When revision for any reason was the end point in the survival analyses, no significant differences were found between the groups. Adjusted incidence ratios of THA varied from 1.9- to 3.0-fold during the study period. Neither the average income within a region nor the morbidity index was associated with the incidence of THA. Across the four categories of hospital volume of total hip replacements, the length of the surgical treatment period was shorter for the highest-volume group than for the lowest-volume group. The odds ratio for dislocations was significantly lower in the high-volume group than in the low-volume group. In patients who were 55 years of age or older, the survival of cementless total hip replacements was as good as that of the cemented replacements. However, multiple wear-related revisions of the cementless cups indicate that excessive polyethylene wear was a major clinical problem with modular cementless cups. The variation in the long-term rates of survival for different cemented stems was considerable.
Cementless proximal porous-coated stems were found to be a good option for elderly patients. In hospitals where hip surgery was performed with a large repertoire of procedures, the indications for performing THA due to primary OA were strict. The socio-economic status of the patient had no apparent effect on the THA rate. Concentrating hip replacements in high-volume hospitals should reduce costs by significantly shortening the length of stay, and may reduce the dislocation rate.
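Implant survival of this kind is typically summarised with the Kaplan-Meier estimator, with revision as the event and end of follow-up as the censoring time. A minimal pure-Python sketch on made-up data (not the register data or the study's actual code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivor function for right-censored data.

    times:  follow-up time of each implant (e.g. years to revision
            or to end of follow-up)
    events: 1 if the implant was revised at that time (the event),
            0 if follow-up ended without revision (censored)
    Returns (time, estimated survival probability) pairs at event times.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        tied = [e for tt, e in pairs if tt == t]  # all implants at time t
        revisions = sum(tied)
        if revisions:
            survival *= (n_at_risk - revisions) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(tied)
        i += len(tied)
    return curve

# Made-up example: five implants, revisions at years 1, 2 and 3,
# censored follow-up ending at years 2 and 4.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

Note the usual tie-handling convention: an event and a censoring at the same time both count toward the risk set at that time, so the year-2 censoring above still contributes to the denominator of the year-2 revision.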