101 results for Time to surgery
Abstract:
Messenger RNAs (mRNAs) can be repressed and degraded by small non-coding RNA molecules. In this paper, we formulate a coarse-grained Markov-chain description of the post-transcriptional regulation of mRNAs by either small interfering RNAs (siRNAs) or microRNAs (miRNAs). We calculate the probability of an mRNA escaping from its domain before it is repressed by siRNAs/miRNAs via calculation of the mean time to threshold: when the number of bound siRNAs/miRNAs exceeds a certain threshold value, the mRNA is irreversibly repressed. In some cases, the analysis can be reduced to counting certain paths in a reduced Markov model. We obtain explicit expressions when the small RNAs bind irreversibly to the mRNA, and we also discuss the reversible binding case. We apply our models to the study of RNA interference in the nucleus, examining the probability of mRNAs escaping via small nuclear pores before being degraded by siRNAs. Using the same modelling framework, we further investigate the effect of small, decoy RNAs (decoys) on the process of post-transcriptional regulation by studying regulation of the tumor suppressor gene PTEN: decoys are able to block binding sites on PTEN mRNAs, thereby reducing the number of sites available to siRNAs/miRNAs and helping to protect them from repression. We calculate the probability of a cytoplasmic PTEN mRNA translocating to the endoplasmic reticulum before being repressed by miRNAs. We support our results with stochastic simulations.
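The irreversible-binding case lends itself to a simple Monte Carlo check. The sketch below is an illustrative toy model, not the paper's actual formulation: it assumes a single escape rate `k_esc`, a binding rate `k_on` per free site, and a fixed repression threshold, with all numbers invented for illustration. Because only the next event matters, each step reduces to a race between two exponential clocks.

```python
import random

def escape_probability_mc(k_on, k_esc, threshold, n_sites, trials=200_000, seed=0):
    """Monte Carlo estimate of P(mRNA escapes before `threshold` small RNAs
    bind irreversibly).  From a state with b bound molecules, escape (rate
    k_esc) races against the next binding event (rate k_on per free site),
    so each step is a Bernoulli trial between the two exponential clocks."""
    rng = random.Random(seed)
    escapes = 0
    for _ in range(trials):
        bound = 0
        while bound < threshold:
            bind_rate = k_on * (n_sites - bound)
            if rng.random() < k_esc / (k_esc + bind_rate):
                escapes += 1
                break
            bound += 1
    return escapes / trials

def escape_probability_exact(k_on, k_esc, threshold, n_sites):
    """Closed form for the same toy chain: the mRNA is repressed only if
    binding wins every race up to the threshold."""
    p_repressed = 1.0
    for b in range(threshold):
        bind_rate = k_on * (n_sites - b)
        p_repressed *= bind_rate / (bind_rate + k_esc)
    return 1.0 - p_repressed
```

With k_on = k_esc = 1, five sites and a threshold of three, the closed form gives 1 − (5/6)(4/5)(3/4) = 0.5, and the simulation agrees to within sampling error.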
Abstract:
Measures of transit accessibility are important in evaluating transit services, planning future services and guiding investment in land-use development. Existing tools measure transit accessibility using averaged walking distance or walking time to public transit. Although mode captivity may have significant implications for one's willingness to walk to public transit, this has not been addressed in the literature to date. Failure to distinguish transit-captive users may lead to overestimates of ridership and of the spatial coverage of transit services. The aim of this research is to integrate the concept of transit captivity into the analysis of walking access to public transit. The conventional way of defining "captive" and "choice" transit users showed no significant difference in walking times according to a preliminary analysis. A cluster analysis technique is therefore used to further divide "choice" users by three main factors, namely age group, labour force status and personal income. After eliminating "true captive" users, defined as those without a driver's licence or without a car in their household, "non-true captive" users were classified into a total of eight groups with similar socio-economic characteristics. The analysis revealed significant differences in walking times and patterns by level of captivity to public transit. This paper challenges the rule-of-thumb of a 400m walking distance to bus stops. On average, people's willingness to walk dropped drastically at 268m and continued to decline steadily until the 670m mark, where there was another drastic drop of 17%, leaving only 10% of all bus riders willing to walk 670m or more. This research found that mothers working part time had the lowest transit captivity and were thus most sensitive to walking time, followed by high-income earners and the elderly. The level of captivity increases for public transit users earning lower incomes, such as students and students working part time.
Abstract:
This article provides a review of techniques for the analysis of survival data arising from respiratory health studies. Popular techniques such as the Kaplan–Meier survival plot and the Cox proportional hazards model are presented and illustrated using data from a lung cancer study. Advanced issues are also discussed, including parametric proportional hazards models, accelerated failure time models, time-varying explanatory variables, simultaneous analysis of multiple types of outcome events and the restricted mean survival time, a novel measure of the effect of treatment.
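For readers wanting to see the mechanics behind a Kaplan–Meier plot, here is a minimal from-scratch sketch of the product-limit estimator. The function name and the toy data are mine, not from the review, and a real analysis would use an established survival package.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  : observed follow-up times
    events : 1 if the outcome event (e.g. death) occurred, 0 if censored
    Returns (time, S(t)) pairs at each distinct event time.  At each event
    time the survival curve is multiplied by (1 - deaths / number at risk);
    censored subjects leave the risk set without contributing a step.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        at_t = [e for tt, e in data if tt == t]  # all subjects observed at t
        deaths = sum(at_t)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(at_t)
        i += len(at_t)
    return curve
```

For example, with follow-up times [1, 2, 3, 4, 5] and event indicators [1, 1, 0, 1, 0], the curve steps to 0.8, 0.6 and 0.3 at times 1, 2 and 4, with the censored subjects at times 3 and 5 shrinking only the risk set.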
Abstract:
In contrast to the well-known Charcot neuroarthropathy (CN) of the foot, CN of the knee is hardly recognized. In a literature search, we only found five articles on total knee arthroplasty for Charcot joints (1–5). We did not find a single article dealing with alternative treatment options or the general clinical course of this disease. We started our study
Abstract:
Prior to embarking on further study into the subject of relevance it is essential to consider why the concept of relevance has remained inconclusive, despite extensive research and its centrality to the discipline of information science. The approach taken in this paper is to reconstruct the science of information retrieval from first principles including the problem statement, role, scope and objective. This framework for document selection is put forward as a straw man for comparison with the historical relevance models. The paper examines five influential relevance models over the past 50 years. Each is examined with respect to its treatment of relevance and compared with the first principles model to identify contributions and deficiencies. The major conclusion drawn is that relevance is a significantly overloaded concept which is both confusing and detrimental to the science.
Abstract:
In 2015, Victoria passed laws removing the time limit within which a survivor of child sexual abuse can commence a civil claim for personal injury. The law also applies to physical abuse, and to psychological injury arising from those forms of abuse. In 2016, New South Wales made almost identical legal reforms. These reforms were partly motivated by the recommendations of inquiries into institutional child abuse. Of particular relevance, the Australian Royal Commission into Institutional Responses to Child Sexual Abuse recommended in 2015 that all States and Territories remove their time limits for civil claims. This presentation explores the problems with standard time limits when applied to child sexual abuse cases (whether occurring within or beyond institutions), the scientific, ethical and legal justifications for lifting the time limits, and solutions for future law reform.
Abstract:
Background Studies investigating the relationship between malnutrition and post-discharge mortality following acute hip fracture yield conflicting results. This study aimed to determine whether malnutrition independently predicted 12-month post-fracture mortality after adjusting for clinically relevant covariates. Methods An ethics approved, prospective, consecutive audit was undertaken for all surgically treated hip fracture inpatients admitted to a dedicated orthogeriatric unit (November 2010–October 2011). The 12-month mortality data were obtained by a dual search of the mortality registry and Queensland Health database. Malnutrition was evaluated using the Subjective Global Assessment. Demographic (age, gender, admission residence) and clinical covariates included fracture type, time to surgery, anaesthesia type, type of surgery, post-surgery time to mobilize and post-operative complications (delirium, pulmonary and deep vein thrombosis, cardiac complications, infections). The Charlson Comorbidity Index was retrospectively applied. All diagnoses were confirmed by the treating orthogeriatrician. Results A total of 322 of 346 patients were available for audit. Increased age (P = 0.004), admission from residential care (P < 0.001), Charlson Comorbidity Index (P = 0.007), malnutrition (P < 0.001), time to mobilize >48 h (P < 0.001), delirium (P = 0.003), pulmonary embolism (P = 0.029) and cardiovascular complication (P = 0.04) were associated with 12-month mortality. Logistic regression analysis demonstrated that malnutrition (odds ratio (OR) 2.4 (95% confidence interval (CI) 1.3–4.7, P = 0.007)), in addition to admission from residential care (OR 2.6 (95% CI 1.3–5.3, P = 0.005)) and pulmonary embolism (OR 11.0 (95% CI 1.5–78.7, P = 0.017)), independently predicted 12-month mortality. Conclusions Findings substantiate malnutrition as an independent predictor of 12-month mortality in a representative sample of hip fracture inpatients. 
Effective strategies to identify and treat malnutrition in hip fracture should be prioritized.
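As a side note on the odds ratios reported above: an odds ratio and its Wald 95% confidence interval can be recovered from a 2×2 table with a few lines of arithmetic. The sketch below uses invented counts purely to show the calculation, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval from a 2x2 table:

                   outcome+   outcome-
        exposed       a          b
        unexposed     c          d

    The CI is computed on the log scale, where the standard error of
    ln(OR) is sqrt(1/a + 1/b + 1/c + 1/d), then exponentiated back.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

With hypothetical counts a=20, b=80, c=10, d=90, this gives OR = 2.25 with a CI of roughly 0.99 to 5.09, illustrating how a wide interval (like the pulmonary embolism OR of 11.0 above) reflects sparse cells.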
Abstract:
Goals: Few studies have repeatedly evaluated quality of life and potentially relevant factors in patients with benign primary brain tumor. The purpose of this study was to explore the relationships between symptom distress, functional status, depression, and quality of life prior to surgery (T1) and 1 month post-discharge (T2).

Patients and methods: This was a prospective cohort study including 58 patients with benign primary brain tumor in one teaching hospital in the Taipei area of Taiwan. The research instruments included the M.D. Anderson Symptom Inventory, the Functional Independence Measure scale, the Hospital Depression Scale, and the Functional Assessment of Cancer Therapy-Brain.

Results: Symptom distress (T1: r=−0.90, p<0.01; T2: r=−0.52, p<0.01), functional status (T1: r=0.56, p<0.01), and depression (T1: r=−0.71, p<0.01) demonstrated significant relationships with patients' quality of life. Multivariate analysis identified that symptom distress (explaining 80.2% of variance, R²inc=0.802, p=0.001) and depression (explaining 5.2%, R²inc=0.052, p<0.001) continued to have a significant independent influence on quality of life prior to surgery (T1) after controlling for key demographic and medical variables. Furthermore, only symptom distress (explaining 27.1%, R²inc=0.271, p=0.001) continued to have a significant independent influence on quality of life at 1 month after discharge (T2).

Conclusions: The study highlights the potential importance of a patient's symptom distress to quality of life prior to and following surgery. Health professionals should inquire about symptom distress over time. Specific interventions for symptoms may reduce their impact on quality of life. Additional studies should evaluate the effect of symptom distress on longer-term quality of life in patients with benign brain tumor.
Abstract:
A hospital consists of a number of wards, units and departments that provide a variety of medical services and interact on a day-to-day basis. Nearly every department within a hospital schedules patients for the operating theatre (OT) and most wards receive patients from the OT following post-operative recovery. Because of the interrelationships between units, disruptions and cancellations within the OT can have a flow-on effect to the rest of the hospital. This often results in dissatisfied patients, nurses and doctors, escalating waiting lists, inefficient resource usage and undesirable waiting times. The objective of this study is to use Operational Research methodologies to enhance the performance of the operating theatre by improving elective patient planning using robust scheduling and improving the overall responsiveness to emergency patients by solving the disruption management and rescheduling problem. OT scheduling considers two types of patients: elective and emergency. Elective patients are selected from a waiting list and scheduled in advance based on resource availability and a set of objectives. This type of scheduling is referred to as ‘offline scheduling’. Disruptions to this schedule can occur for various reasons including variations in length of treatment, equipment restrictions or breakdown, unforeseen delays and the arrival of emergency patients, which may compete for resources. Emergency patients consist of acute patients requiring surgical intervention or in-patients whose conditions have deteriorated. These may or may not be urgent and are triaged accordingly. Most hospitals reserve theatres for emergency cases, but when these or other resources are unavailable, disruptions to the elective schedule result, such as delays in surgery start time, elective surgery cancellations or transfers to another institution. Scheduling of emergency patients and the handling of schedule disruptions is an ‘online’ process typically handled by OT staff. 
This means that decisions are made ‘on the spot’ in a ‘real-time’ environment. There are three key stages to this study: (1) Analyse the performance of the operating theatre department using simulation. Simulation is used as a decision support tool and involves changing system parameters and elective scheduling policies and observing the effect on the system’s performance measures; (2) Improve the viability of elective schedules by making offline schedules more robust to differences between expected and actual treatment times, using robust scheduling techniques. This will improve access to care and responsiveness to emergency patients; (3) Address the disruption management and rescheduling problem (which incorporates emergency arrivals) using innovative robust reactive scheduling techniques. The robust schedule will form the baseline schedule for the online robust reactive scheduling model.
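The simulation idea in stage (1) can be illustrated with a toy Monte Carlo model of a single elective session. This is a minimal sketch of my own, not the study's model: actual case durations are assumed to scatter around the planned ones with normal multiplicative error, and the output is the probability that the session overruns, the quantity a robust schedule tries to keep small by limiting planned utilisation.

```python
import random

def overrun_probability(planned, session_len, rel_sd=0.25, trials=20_000, seed=1):
    """Estimate P(elective session overruns) when actual durations vary.

    planned     : planned case durations (minutes)
    session_len : session length (minutes)
    rel_sd      : relative standard deviation of actual vs planned duration
    Each actual duration is drawn as planned * N(1, rel_sd), floored at
    10 minutes so no case has a non-positive length.
    """
    rng = random.Random(seed)
    overruns = 0
    for _ in range(trials):
        total = sum(max(10.0, p * rng.gauss(1.0, rel_sd)) for p in planned)
        if total > session_len:
            overruns += 1
    return overruns / trials
```

Comparing a loosely packed schedule with a fully packed one in a 480-minute session shows the trade-off directly: a schedule planned to 100% utilisation overruns roughly half the time, while one with generous slack almost never does, at the cost of fewer cases per session.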
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0–78.0) years, median PD duration 6.75 (0.5–24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B) while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0kg/m², p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0–92.0) years, median PD duration 6.0 (0.0–31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0kg/m²; MAC 29.1 vs 25.5cm; waist circumference 95.5 vs 82.5cm; calf circumference 36.5 vs 32.5cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2–11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools has been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5kg/m²), age-specific BMI cut-offs (≤18.5kg/m² for under 65 years, ≤23.5kg/m² for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and take less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score on each of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01–1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04–1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02–1.19), less anxiety (OR 0.90, 95% CI 0.82–0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07–1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01–0.26), more depressive symptoms (β=0.02, 95% CI 0.01–0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01–0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.

Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. INT gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3kJ/kg vs 3.8kJ/kg and 0.3g/kg vs 0.15g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. The changes in intake did not differ between the groups.
There were no significant changes in any motor or non-motor symptoms, in "off" times or in dyskinesias in either group. Aspects of quality of life, especially emotional well-being, also improved over the 12 weeks. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on Parkinson's disease symptoms and a positive effect on quality of life.
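The sensitivity/specificity figures reported for the screening tools above (e.g. MNA-SF against the SGA gold standard) reduce to simple confusion-matrix arithmetic. A minimal sketch with invented labels, purely to show the calculation:

```python
def screening_performance(screen_positive, gold_positive):
    """Sensitivity and specificity of a screening tool against a gold-standard
    classification, given parallel 0/1 lists (1 = malnourished / at risk)."""
    pairs = list(zip(screen_positive, gold_positive))
    tp = sum(1 for s, g in pairs if s and g)          # screen+, gold+
    fn = sum(1 for s, g in pairs if not s and g)      # screen-, gold+
    tn = sum(1 for s, g in pairs if not s and not g)  # screen-, gold-
    fp = sum(1 for s, g in pairs if s and not g)      # screen+, gold-
    sensitivity = tp / (tp + fn)  # fraction of true cases the screen catches
    specificity = tn / (tn + fp)  # fraction of non-cases correctly cleared
    return sensitivity, specificity
```

For instance, a tool that flags every true case but also one healthy subject out of three scores Sn = 100% with Sp = 67%, the same pattern as the PG-SGA cut-off of 4 reported above (Sn 100%, Sp 69.8%).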
Abstract:
This paper discusses findings made during a study of energy use feedback in the home (eco-feedback), well after the novelty has worn off. Addressing four important knowledge gaps in the research, we explore eco-feedback over longer time scales, focusing on instances where the feedback was not of lasting benefit to users rather than when it was. Drawing from 23 semi-structured interviews with Australian householders, we found that an initially high level of engagement gave way over time to disinterest, neglect and, in certain cases, technical malfunction. Additionally, preconceptions about the “purpose” of the feedback were found to affect use. We propose expanding the scope of enquiry for eco-feedback in several ways, and describe how eco-feedback that better supports decision-making in the “maintenance phase”, i.e. once the initial novelty has worn off, may be key to longer-term engagement.
Abstract:
This research utilised software developed for managing the Australian sugar industry's cane rail transport operations, together with the GPS data used to track locomotives for safe operation of the railway system, to improve transport operations. As a result, time usage in the sugarcane railway can now be summarised, and locomotive arrival times at sidings and mills can be predicted. This information will help the development of more efficient run schedules and enable mill staff and harvesters to better plan their shifts ahead, enabling cost reductions through better use of available time.
Abstract:
PURPOSE To review records of 330 patients who underwent surgery for femoral neck fractures with or without preoperative anticoagulation therapy. METHODS Medical records of 235 women and 95 men aged 48 to 103 years (mean, 81.6; standard deviation [SD], 13.1) who underwent surgery for femoral neck fractures with or without preoperative anticoagulation therapy were reviewed. 30 patients were on warfarin, 105 on aspirin, 28 on clopidogrel, and 167 were controls. The latter 3 groups were combined as the non-warfarin group and compared with the warfarin group. Hospital mortality, time from admission to surgery, length of hospital stay, return to theatre, and postoperative complications (wound infection, deep vein thrombosis, and pulmonary embolism) were assessed. RESULTS The warfarin and control groups were significantly younger than the clopidogrel and aspirin groups (80.8 vs. 80.0 vs. 84.2 vs. 83.7 years, respectively, p<0.05). 81% of the patients underwent surgery within 48 hours of admission. The overall mean time from admission to surgery was 1.8 days; it was longer in the warfarin group than in the aspirin, clopidogrel, and control groups (3.3 vs. 1.8 vs. 1.6 vs. 1.6 days, respectively, p<0.001). The mean length of hospital stay was 17.5 (SD, 9.6; range, 3-54) days. The overall hospital mortality was 3.9%; it was 6.7% in the warfarin group, 3.8% in the aspirin group, 3.6% in the clopidogrel group, and 3.6% in the control group (p=0.80). Four patients returned to theatre for surgery: one in the warfarin group for washout of a haematoma, two in the aspirin group (for repositioning of a mal-fixation and for debridement of a wound infection), and one in the control group for debridement of a wound infection. The warfarin group did not differ significantly from the non-warfarin group in terms of postoperative complication rate (6.7% vs. 2.7%, p=0.228) or rate of return to theatre (3.3% vs. 1%, p=0.318).
CONCLUSION It is safe to continue aspirin and clopidogrel prior to surgical treatment for femoral neck fracture. The risk of delaying surgery outweighs the peri-operative bleeding risk.
Abstract:
Background The incidence of obesity amongst patients presenting for elective Total Hip Arthroplasty (THA) has increased in the last decade, and the relationship between obesity and the need for joint replacement has been demonstrated. This study evaluates the effects of morbid obesity on outcomes following primary THA by comparing short-term outcomes in THA between a morbidly obese (BMI ≥40) and a normal-weight (BMI 18.5 - <25) cohort at our institution between January 2003 and December 2010. Methods Thirty-nine patients in the morbidly obese group were compared with 186 in the normal-weight group. Operative time, length of stay, complications, readmission and length of readmission were compared. Results Operative time was increased in the morbidly obese group at 122 minutes compared with 100 minutes (p=0.002). Post-operatively, BMI ≥40 was associated with an increased surgery-related 30-day readmission rate (12.8% vs 2.7%, p=0.005) as well as a 5.1-fold increase in surgery-related readmission bed days: 0.32 bed days per patient for the normal-weight group compared with 1.64 per patient for the morbidly obese (p=0.026). Conclusion Morbidly obese patients present a technical challenge, and it is likely that this challenge and the resultant complications are underestimated. More work needs to be performed to enable suitable allocation of resources.
Abstract:
There is a growing evidence-base in the epidemiological literature that demonstrates significant associations between people’s living circumstances – including their place of residence – and their health-related practices and outcomes (Leslie, 2005; Karpati, Bassett, & McCord, 2006; Monden, Van Lenthe, & Mackenbach, 2006; Parkes & Kearns, 2006; Cummins, Curtis, Diez-Roux, & Macintyre, 2007; Turrell, Kavanagh, Draper, & Subramanian, 2007). However, these findings raise questions about the ways in which living places, such as households and neighbourhoods, figure in the pathways connecting people and health (Frolich, Potvin, Chabot, & Corin, 2002; Giles-Corti, 2006; Brown et al, 2006; Diez Roux, 2007). This thesis addressed these questions via a mixed methods investigation of the patterns and processes connecting people, place, and their propensity to be physically active. Specifically, the research in this thesis examines a group of lower-socioeconomic residents who had recently relocated from poorer suburbs to a new urban village with a range of health-related resources. Importantly, the study contrasts their historical relationship with physical activity with their reactions to, and everyday practices in, a new urban setting designed to encourage pedestrian mobility and autonomy. The study applies a phenomenological approach to understanding living contexts based on Berger and Luckmann’s (1966) conceptual framework in The Social Construction of Reality. This framework enables a questioning of the concept of context itself, and a treatment of it that goes beyond environmental factors to the processes via which experiences and interactions are made meaningful. This approach makes reference to people’s histories, habituations, and dispositions in an exploration of the relationship between social contexts and human behaviour.
This framework for thinking about context is used to generate an empirical focus on the ways in which this residential group interacts with various living contexts over time to create a particular construction of physical activity in their lives. A methodological approach suited to this thinking was found in Charmaz’s (1996; 2001; 2006) adoption of a social constructionist approach to grounded theory. This approach enabled a focus on people’s own constructions and versions of their experiences through a rigorous inductive method, which provided a systematic strategy for identifying patterns in the data. The findings of the study point to factors such as ‘childhood abuse and neglect’, ‘early homelessness’, ‘fear and mistrust’, ‘staying indoors and keeping to yourself’, ‘conflict and violence’, and ‘feeling fat and ugly’ as contributors to an ongoing core category of ‘identity management’, which mediates the relationship between participants’ living contexts and their physical activity levels. It identifies barriers at the individual, neighbourhood, and broader ecological levels that prevent this residential group from being more physically active, and which contribute to the ways in which they think about, or conceptualise, this health-related behaviour in relationship to their identity and sense of place – both geographic and societal. The challenges of living well and staying active in poorer neighbourhoods and in places where poverty is concentrated were highlighted in detail by participants. Participants’ reactions to the new urban neighbourhood, and the depth of their engagement with the resources present, are revealed in the context of their previous life-experiences with both living places and physical activity. 
Moreover, an understanding of context as participants’ psychological constructions of various social and living situations, based on prior experience, attitudes, and beliefs, was formulated, with implications for how socioeconomic contextual effects on health are studied in the future. More detailed findings are presented in three published papers with implications for health promotion, urban design, and health inequalities research. This thesis makes a substantive, conceptual, and methodological contribution to future research efforts interested in how physical activity is conceptualised and constructed within lower socioeconomic living contexts, and why this is. The data collected and analysed for this PhD generate knowledge about the psychosocial processes and mechanisms behind the patterns observed in epidemiological research regarding socioeconomic health inequalities. Further, the thesis highlights the ways in which lower socioeconomic living contexts tend to shape dispositions, attitudes, and lifestyles, ultimately resulting in worse health and life chances for those who occupy them.