900 results for Parametric VaR (Value-at-Risk)


Relevance:

30.00%

Publisher:

Abstract:

This thesis studies the field of asset price bubbles. It comprises three independent chapters, each of which directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubble assumed in each chapter is consistent with rational expectations, so the price bubbles investigated here are known in the literature as rational bubbles. The three chapters are described below.

Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous-agent endowment-economy asset pricing model with risky housing, endogenous collateral and defaults. Investment in housing is subject to an idiosyncratic risk and some mortgages are defaulted on in equilibrium. We analytically derive the leverage, or endogenous loan-to-value ratio, which arises from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit-easing effect, encouraging excess leverage and generating credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence.

Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identifying this persistence and these potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market while accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour. Hence, we use semi-parametric (Whittle) and parametric (ARFIMA) procedures that are consistent under a variety of residual biases to estimate the long memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, which are robust to non-normality and heteroskedastic errors, identify far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence of rational bubbles at both the aggregate and regional levels.

Chapter 3: In this final chapter, we attempt to answer the following question: to what extent should individuals participate in the stock market and hold risky assets over their lifecycle? We answer this question by employing a lifecycle consumption-portfolio choice model with housing, labour income and time-varying predictable returns, in which agents face a borrowing constraint. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return-predictability case with that of IID returns. We successfully resolve the puzzles and find equity holdings and participation rates close to the data. We also find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation. A high realization of the factor (the dividend-price ratio) and high persistence of the factor process, indicative of stock market bubbles, raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being severe for investors in the later years of the life-cycle. Furthermore, investors facing time-varying (predictable) returns hedged background risks significantly better than those facing IID returns.
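Chapter 2's semiparametric estimation of the long memory parameter d can be illustrated with a short sketch. The log-periodogram (GPH) estimator below is a standard semiparametric alternative in the same family as the Whittle procedure named above, not the thesis' exact implementation; the bandwidth rule and simulated series are assumptions.

```python
import numpy as np

def gph_estimate(x, bandwidth_power=0.5):
    """Log-periodogram (GPH) estimate of the long memory parameter d.

    Semiparametric: regress the log periodogram on log(4 sin^2(lambda/2))
    at the first m = n**bandwidth_power Fourier frequencies; slope = -d.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    m = int(np.floor(n ** bandwidth_power))          # number of low frequencies used
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n    # Fourier frequencies lambda_j
    dft = np.fft.fft(x - x.mean())
    periodogram = (np.abs(dft[1:m + 1]) ** 2) / (2.0 * np.pi * n)
    regressor = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)  # OLS slope = -d
    return -slope

# Illustrative use on a simulated white-noise series (true d = 0):
rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(2048)))
```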

Relevance:

30.00%

Publisher:

Abstract:

Background: The aim of this study was the evaluation of a fast Gradient Spin Echo Technique (GraSE) for cardiac T2-mapping, combining a robust estimation of T2 relaxation times with short acquisition times. The sequence was compared against two previously introduced T2-mapping techniques in a phantom and in vivo. Methods: Phantom experiments were performed at 1.5 T using a commercially available cylindrical gel phantom. Three different T2-mapping techniques were compared: a Multi Echo Spin Echo (MESE; serving as a reference), a T2-prepared balanced Steady State Free Precession (T2prep) and a Gradient Spin Echo sequence. For the subsequent in vivo study, 12 healthy volunteers were examined on a clinical 1.5 T scanner. The three T2-mapping sequences were performed at three short-axis slices. Global myocardial T2 relaxation times were calculated and statistical analysis was performed. For assessment of pixel-by-pixel homogeneity, the number of segments showing an inhomogeneous T2 value distribution, defined by a pixel SD exceeding 20% of the corresponding observed T2 time, was counted. Results: Phantom experiments showed a greater difference of measured T2 values between T2prep and MESE than between GraSE and MESE, especially for species with low T1 values. Both GraSE and T2prep resulted in an overestimation of T2 times compared to MESE. In vivo, significant differences between mean T2 times were observed. In general, T2prep resulted in the lowest (52.4 ± 2.8 ms) and GraSE in the highest T2 estimates (59.3 ± 4.0 ms). Analysis of pixel-by-pixel homogeneity revealed the fewest segments with inhomogeneous T2 distribution for GraSE-derived T2 maps. Conclusions: The GraSE sequence is a fast and robust sequence, combining advantages of both the MESE and T2prep techniques, which promises to improve the clinical applicability of T2-mapping in the future. Our study revealed significant differences in derived mean T2 values when applying different sequence designs. Therefore, a systematic comparison of different cardiac T2-mapping sequences and the establishment of dedicated reference values should be the goal of future studies.
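Whatever the readout (MESE, T2prep or GraSE), a T2 map is ultimately obtained by fitting a mono-exponential decay S(TE) = S0·exp(−TE/T2) to the signal at each pixel across echo times. A minimal pixel-wise sketch is shown below; the echo times, noise level and use of scipy's curve_fit are illustrative assumptions, not the sequences' vendor fitting routines.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_t2(echo_times_ms, signal):
    """Fit S(TE) = S0 * exp(-TE / T2) for a single pixel; returns (S0, T2 in ms)."""
    model = lambda te, s0, t2: s0 * np.exp(-te / t2)
    p0 = (signal.max(), 50.0)  # rough start: peak signal, ~50 ms myocardial T2
    (s0, t2), _ = curve_fit(model, echo_times_ms, signal, p0=p0, maxfev=5000)
    return s0, t2

# Illustrative pixel with a true T2 of 55 ms and mild noise
te = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
sig = 1000.0 * np.exp(-te / 55.0) + np.random.default_rng(1).normal(0, 5, te.size)
print(fit_t2(te, sig))
```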

Relevance:

30.00%

Publisher:

Abstract:

Studying the structure and dynamics of natural regeneration in the understory of plantations of exotic species, such as those of the genus Pinus, provides information for the management, conservation and re-establishment of the native species of a plant community. The objective of this work was to identify and quantify the dynamics of natural regeneration of the shrub and tree species occurring in the understory of a Pinus caribaea stand at the Saltinho Biological Reserve (Rebio de Saltinho), Pernambuco, Brazil. Regenerating species were measured in 10 permanent plots of 1 x 50 m, including individuals with a circumference at 30 cm above the ground (CAB 0.30 m) ≤ 15 cm and height greater than one metre. Height was classified as: Class 1, shrub and tree individuals with height 1 ≤ H ≤ 2 m; Class 2, with height 2 < H ≤ 3 m; and Class 3, with height > 3 m and CAP ≤ 15 cm. Phytosociological parameters, regeneration dynamics, the Shannon diversity index (H') and Pielou's evenness (J') were calculated. Protium heptaphyllum had the highest number of individuals and importance value (IV), and Miconia prasina the highest frequency in both surveys. The H' index fell from 3.32 nats.ind-1 (2007) to 3.07 nats.ind-1 (2012), and evenness J' from 0.85 to 0.62, indicating a decrease in both diversity and evenness of the distribution. The 2012 survey recorded a 12.5% increase in the number of individuals, and the regenerating individuals of 2007 had 48.31% mortality. With respect to the number of individuals and basal area, the percentage gains exceeded the losses. We conclude that the ecological succession of understory regeneration in the studied stand is changing positively, and that the Pinus caribaea stand is not preventing the emergence of new individuals and species.
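The H' and J' figures above follow the standard Shannon and Pielou formulas; the sketch below computes both from species abundance counts (the counts themselves are hypothetical, not the survey's data).

```python
import numpy as np

def shannon_pielou(abundances):
    """Shannon diversity H' (nats.ind-1) and Pielou evenness J' = H'/ln(S)."""
    n = np.asarray(abundances, dtype=float)
    n = n[n > 0]
    p = n / n.sum()                  # relative abundance of each species
    h = -np.sum(p * np.log(p))       # Shannon index in nats
    j = h / np.log(p.size)           # Pielou evenness (S = number of species)
    return h, j

# Hypothetical abundances for a few understory species
print(shannon_pielou([120, 45, 30, 12, 8, 5]))
```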

Relevance:

30.00%

Publisher:

Abstract:

Receiving personalised feedback on body mass index and other health risk indicators may prompt behaviour change. Few studies have investigated men's reactions to receiving objective feedback on such measures and detailed information on physical activity and sedentary time. The aim of my research was to understand the meanings different forms of objective feedback have for overweight/obese men, and to explore whether these varied between groups. Participants took part in Football Fans in Training, a gender-sensitised weight loss programme delivered via Scottish Professional Football Clubs. Semi-structured interviews were conducted with 28 men, purposively sampled from four clubs, to investigate the experiences of men who achieved and did not achieve their 5% weight loss target. Data were analysed using the principles of thematic analysis and interpreted through Self-Determination Theory and sociological understandings of masculinity. Several factors were vital in supporting a 'motivational climate' in which men could feel 'at ease' and adopt self-regulation strategies: the 'place' was described as motivating, while the 'people' (other men 'like them', fieldwork staff, community coaches) played supportive and facilitative roles. Men who achieved greater weight loss were more likely to describe being motivated as a consequence of receiving information on their objective health risk indicators. They continued using self-monitoring technologies after the programme because they found doing so enjoyable, or they had redefined themselves by integrating new-found activities into their lives and no longer relied on external technologies or feedback. They were more likely to see post-programme feedback as confirmation of success, so long as they could fully interpret the information. Men who did not achieve their 5% weight loss reported no longer being motivated to maintain their activity levels or to self-monitor them with a pedometer; for them, social support within the programme appeared more important. These men were also less positive about objective post-programme feedback, which confirmed their lack of success and had less utility as a motivational tool. Providing different forms of objective feedback to men within an environment that has intrinsic value (e.g. a football club setting) and is congruent with common cultural constructions of masculinity appears more conducive to health behaviour change.

Relevance:

30.00%

Publisher:

Abstract:

Background: Depression is a major health problem worldwide and the majority of patients presenting with depressive symptoms are managed in primary care. Current approaches for assessing depressive symptoms in primary care are not accurate in predicting future clinical outcomes, which may potentially lead to over- or under-treatment. The Allostatic Load (AL) theory suggests that by measuring multi-system biomarker levels as a proxy for multi-system physiological dysregulation, it is possible to identify individuals at risk of adverse health outcomes at a prodromal stage. The Allostatic Index (AI) score, calculated by applying statistical formulations to different multi-system biomarkers, has been associated with depressive symptoms.

Aims and Objectives: To test the hypothesis that a combination of allostatic load (AL) biomarkers will form a predictive algorithm in defining clinically meaningful outcomes in a population of patients presenting with depressive symptoms. The key objectives were: (1) to explore the relationship between various allostatic load biomarkers and the prevalence of depressive symptoms, especially in patients diagnosed with three common cardiometabolic diseases (Coronary Heart Disease (CHD), Diabetes and Stroke); (2) to explore whether allostatic load biomarkers predict clinical outcomes in patients with depressive symptoms, especially in patients with these three cardiometabolic diseases; and (3) to develop a predictive tool to identify individuals with depressive symptoms at highest risk of adverse clinical outcomes.

Methods: Two datasets were used. 'DepChron' was a dataset of 35,537 patients with existing cardiometabolic disease collected as part of routine clinical practice. 'Psobid' was a research data source containing health-related information from 666 participants recruited from the general population. The clinical outcomes for both datasets were studied using electronic data linkage to hospital and mortality health records, undertaken by the Information Services Division, Scotland. Cross-sectional associations between allostatic load biomarkers calculated at baseline and the clinical severity of depression, assessed by a symptom score, were examined using logistic and linear regression models in both datasets. Cox proportional hazards survival models were used to assess the relationship between allostatic load biomarkers at baseline and the risk of adverse physical health outcomes at follow-up in patients with depressive symptoms. The possibility of interaction between depressive symptoms and allostatic load biomarkers in risk prediction of adverse clinical outcomes was studied using analysis of variance (ANOVA). Finally, the value of constructing a risk scoring scale using patient demographics and allostatic load biomarkers for predicting adverse outcomes in depressed patients was investigated using clinical risk prediction modelling and Area Under the Curve (AUC) statistics.

Key Results: Literature review findings. The literature review showed that twelve blood-based peripheral biomarkers were statistically significant in predicting six different clinical outcomes in participants with depressive symptoms. Outcomes related to both mental health (depressive symptoms) and physical health were statistically associated with pre-treatment levels of peripheral biomarkers; however, only two studies investigated outcomes related to physical health. Cross-sectional analysis findings. In DepChron, dysregulation of individual allostatic biomarkers (mainly cardiometabolic) had a non-linear association with an increased probability of co-morbid depressive symptoms (assessed by Hospital Anxiety and Depression Scale, HADS-D ≥ 8). A composite AI score constructed using five biomarkers did not lead to any improvement in the observed strength of the association. In Psobid, BMI had a significant cross-sectional association with the probability of depressive symptoms (assessed by General Health Questionnaire, GHQ-28 ≥ 5). BMI, triglycerides, high-sensitivity C-reactive protein (CRP) and High Density Lipoprotein (HDL) cholesterol had significant cross-sectional relationships with the continuous measure of GHQ-28. A composite AI score constructed using 12 biomarkers did not show a significant association with depressive symptoms among Psobid participants. Longitudinal analysis findings. In DepChron, three clinical outcomes were studied over four years: all-cause death, all-cause hospital admissions and a composite major adverse cardiovascular outcome, MACE (cardiovascular death or admission due to MI/stroke/HF). The presence of depressive symptoms and a composite AI score calculated using mainly peripheral cardiometabolic biomarkers were each significantly associated with all three clinical outcomes over the following four years in DepChron patients. There was no evidence of an interaction between AI score and the presence of depressive symptoms in risk prediction of any of the three clinical outcomes. There was a statistically significant interaction between SBP and depressive symptoms in risk prediction of the major adverse cardiovascular outcome, and between HbA1c and depressive symptoms in risk prediction of all-cause mortality for patients with diabetes. In Psobid, depressive symptoms (assessed by GHQ-28 ≥ 5) did not have a statistically significant association with any of the four outcomes studied at seven years: all-cause death, all-cause hospital admission, MACE and incidence of new cancer. A composite AI score at baseline had a significant association with the risk of MACE at seven years after adjusting for confounders. A continuous measure of IL-6 at baseline had a significant association with the risk of three clinical outcomes: all-cause mortality, all-cause hospital admissions and major adverse cardiovascular events. Raised total cholesterol at baseline was associated with a lower risk of all-cause death at seven years, while a raised waist-hip ratio (WHR) at baseline was associated with a higher risk of MACE at seven years among Psobid participants. There was no significant interaction between depressive symptoms and peripheral biomarkers (individual or combined) in risk prediction of any of the four clinical outcomes under consideration. Risk scoring system development. In the DepChron cohort, a scoring system was constructed based on eight baseline demographic and clinical variables to predict the risk of MACE over four years. The AUC value for the risk scoring system was modest at 56.7% (95% CI 55.6 to 57.5%). In Psobid, it was not possible to perform this analysis due to the low event rate observed for the clinical outcomes.

Conclusion: Individual peripheral biomarkers were found to have cross-sectional associations with depressive symptoms both in patients with cardiometabolic disease and in middle-aged participants recruited from the general population. An AI score calculated with different statistical formulations was of no greater benefit in predicting concurrent depressive symptoms or clinical outcomes at follow-up, over and above its individual constituent biomarkers, in either cohort. SBP had a significant interaction with depressive symptoms in predicting cardiovascular events in patients with cardiometabolic disease; HbA1c had a significant interaction with depressive symptoms in predicting all-cause mortality in patients with diabetes. Peripheral biomarkers may have a role in predicting clinical outcomes in patients with depressive symptoms, especially those with existing cardiometabolic disease, and this merits further investigation.
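A compact sketch of the kind of analysis described above: a composite AI score built by flagging the highest-risk quartile of each biomarker, then entered alongside depressive symptoms into a Cox proportional hazards model. The column names, quartile rule and use of the lifelines package are illustrative assumptions, not the thesis' actual pipeline; the data are synthetic.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def add_allostatic_index(df, biomarkers):
    """Composite AI score: one point per biomarker in its highest-risk quartile."""
    flags = pd.concat(
        [(df[b] >= df[b].quantile(0.75)).astype(int) for b in biomarkers], axis=1
    )
    out = df.copy()
    out["ai_score"] = flags.sum(axis=1)
    return out

# Synthetic stand-in for a cohort with follow-up time, event flag, depressive
# symptoms (e.g. HADS-D >= 8) and a few baseline biomarkers.
rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "sbp": rng.normal(135, 15, n),
    "hba1c": rng.normal(7.0, 1.2, n),
    "crp": rng.lognormal(0.5, 0.8, n),
    "depressed": rng.integers(0, 2, n),
    "time": rng.exponential(4.0, n),        # years of follow-up
    "event": rng.integers(0, 2, n),         # 1 = death/MACE, 0 = censored
})
df = add_allostatic_index(df, ["sbp", "hba1c", "crp"])

cph = CoxPHFitter()
cph.fit(df[["time", "event", "ai_score", "depressed"]],
        duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])   # hazard ratios for AI score and depression
```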

Relevance:

30.00%

Publisher:

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share a similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in their diversity, but they are integral to the establishment of classes of risk exposure and to the planning and deployment of appropriate preservation strategies. We explore several research objectives over the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, to develop an appropriate methodology for risk management, to evaluate existing preservation evaluation approaches and metrics, to structure best-practice knowledge and, lastly, to demonstrate a range of tools that utilise our findings. We describe a mixed methodology that uses interviews and surveys, extensive content analysis, practical case studies, and iterative software and ontology development. We build on a robust foundation: the development of the Digital Repository Audit Method Based on Risk Assessment. We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome and of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and their mitigation. They can be deconstructed into their atomic units, and responsibility for their resolution delegated appropriately. We continue by describing how the manifestation of risks typically spans an entire organisational environment and how, as the focus of our analysis, risk safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements. Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and by exposing the resource and its associated applications to evaluation by the digital preservation community.
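One way to picture the "atomic units" of risk mentioned above is as structured register entries that tie a risk's likelihood and impact to an owner and its mitigations, broadly in the spirit of (but not taken from) the PORRO vocabulary. The field names and scoring scale below are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    """An atomic risk record: manifestation, exposure, and delegated mitigation."""
    identifier: str
    description: str
    likelihood: int          # e.g. 1 (rare) .. 5 (almost certain)
    impact: int              # e.g. 1 (negligible) .. 5 (catastrophic)
    owner: str               # role or department responsible for resolution
    mitigations: list[str] = field(default_factory=list)

    @property
    def exposure(self) -> int:
        """Simple likelihood x impact score used to rank risks in a register."""
        return self.likelihood * self.impact

# Illustrative entry for a format-obsolescence risk in a repository
r = RiskEntry("R-012", "Master files held in an obsolescent raster format",
              likelihood=3, impact=4, owner="Collections team",
              mitigations=["Migrate to TIFF", "Record format registry entry"])
print(r.identifier, r.exposure)
```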

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to estimate the spatial distribution of work accident risk in the informal work market in the urban zone of an industrialized city in southeast Brazil and to examine concomitant effects of age, gender, and type of occupation after controlling for spatial risk variation. The basic methodology adopted was that of a population-based case-control study with particular interest focused on the spatial location of work. Cases were all casual workers in the city suffering work accidents during a one-year period; controls were selected from the source population of casual laborers by systematic random sampling of urban homes. The spatial distribution of work accidents was estimated via a semiparametric generalized additive model with a nonparametric bidimensional spline of the geographical coordinates of cases and controls as the nonlinear spatial component, and including age, gender, and occupation as linear predictive variables in the parametric component. We analyzed 1,918 cases and 2,245 controls between 1/11/2003 and 31/10/2004 in Piracicaba, Brazil. Areas of significantly high and low accident risk were identified in relation to mean risk in the study region (p < 0.01). Work accident risk for informal workers varied significantly in the study area. Significant age, gender, and occupational group effects on accident risk were identified after correcting for this spatial variation. A good understanding of high-risk groups and high-risk regions underpins the formulation of hypotheses concerning accident causality and the development of effective public accident prevention policies.
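The model described above, a binary case-control outcome with a nonparametric bidimensional spatial smooth plus parametric terms for age, gender and occupation, can be sketched as follows. The pygam package, the column layout and the simulated data are assumptions for illustration; they are not the study's actual implementation.

```python
import numpy as np
from pygam import LogisticGAM, te, l, f

# Assumed column layout of X: 0 = x-coordinate, 1 = y-coordinate,
# 2 = age, 3 = gender (0/1), 4 = occupation group code.
# y: 1 = case (work accident), 0 = control.
rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([
    rng.uniform(0, 10, n), rng.uniform(0, 10, n),      # geographic coordinates
    rng.integers(18, 65, n),                           # age
    rng.integers(0, 2, n),                             # gender
    rng.integers(0, 5, n),                             # occupation group
])
y = rng.integers(0, 2, n)                              # placeholder outcome

# Bivariate tensor-product smooth of the coordinates captures spatial risk;
# age enters linearly, gender and occupation as factors (the parametric part).
gam = LogisticGAM(te(0, 1) + l(2) + f(3) + f(4)).fit(X, y)
gam.summary()
```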

Relevance:

30.00%

Publisher:

Abstract:

With the final purpose of adding value to Amorim Turismo, several papers were analysed, key stakeholders were heard, and both competitors and the market were studied. After this evaluation, it was concluded that there is an opportunity to consolidate the quality of the service offered, and it was with this goal in mind that several recommendations were made. However, these recommendations are subject to a cost restriction, which was not neglected and which should be considered in further complementary research. A risk assessment was also conducted so that future issues can be anticipated and dealt with preventively.

Relevance:

30.00%

Publisher:

Abstract:

Investors value the special attributes of monetary assets (e.g., exchangeability, liquidity, and safety) and pay a premium for holding them in the form of a lower return rate. The user cost of holding monetary assets can be measured approximately by the difference between the returns on illiquid risky assets and those on safer liquid assets. A more appropriate measure should adjust this difference by the differential risk of the assets in question. We investigate the impact that time non-separable preferences have on the estimation of the risk-adjusted user cost of money. Using U.K. data from 1965Q1 to 2011Q1, we estimate a habit-based asset pricing model with money in the utility function and find that the risk adjustment for risky monetary assets is negligible. Thus, researchers can dispense with risk-adjusting the user cost of money when constructing monetary aggregate indexes.
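For reference, the user cost the abstract has in mind is usually written in Barnett's form; the sketch below gives that formula plus a generic risk-adjustment term. The notation (R_t for the benchmark return, r_it for the own return, psi_it for the covariance correction) is assumed here and is not quoted from the paper.

```latex
% Barnett's (unadjusted) user cost of monetary asset i, where R_t is the
% return on an illiquid benchmark asset and r_{it} the own return on the
% liquid monetary asset:
\[
  \pi_{it} \;=\; \frac{R_t - r_{it}}{1 + R_t}
\]
% The risk-adjusted counterpart adds a covariance correction \psi_{it}
% derived from the pricing kernel (habit-based in this paper):
\[
  \pi_{it}^{\mathrm{adj}} \;=\;
  \frac{\mathbb{E}_t R_t - \mathbb{E}_t r_{it}}{1 + \mathbb{E}_t R_t}
  \;+\; \psi_{it},
\]
% and the abstract's finding is that \psi_{it} is negligible in U.K. data,
% so the unadjusted formula suffices for monetary aggregation.
```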

Relevance:

30.00%

Publisher:

Abstract:

Purpose – Curve fitting from unordered noisy point samples is needed for surface reconstruction in many applications. In the literature, several approaches have been proposed to solve this problem. However, previous works lack a formal characterization of the curve fitting problem and an assessment of the effect of several parameters (i.e. scalars that remain constant in the optimization problem), such as the number of control points (m), curve degree (b), knot vector composition (U), norm degree (k), and point sample size (r), on the optimized curve reconstruction measured by a penalty function (f). The paper aims to discuss these issues. Design/methodology/approach – A numerical sensitivity analysis of the effect of m, b, k and r on f, and a characterization of the fitting procedure from the mathematical viewpoint, are performed. Also, the spectral (frequency) analysis of the derivative of the angle of the fitted curve with respect to u is explored as a means to detect spurious curls and peaks. Findings – It is more effective to find optimum values for m than for k or b in order to obtain good results, because the topological faithfulness of the resulting curve strongly depends on m. Furthermore, when an excessive number of control points is used, the resulting curve presents spurious curls and peaks. The authors were able to detect the presence of such spurious features with spectral analysis. Also, the authors found that the curve fitting method is robust to significant decimation of the point sample. Research limitations/implications – The authors have addressed important gaps in previous works in this field, determining which of the curve fitting parameters m, b and k influence the results the most, and how. Also, the authors performed a characterization of the curve fitting problem from the optimization perspective and devised a method to detect spurious features in the fitted curve. Practical implications – This paper provides a methodology to select the important tuning parameters in a formal manner. Originality/value – To the best of the authors' knowledge, no previous work has formally evaluated the sensitivity of the goodness of the curve fit with respect to the different possible tuning parameters (curve degree, number of control points, norm degree, etc.).
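The optimization at the heart of this kind of study is a least-squares B-spline fit whose control point count m, degree b and knot vector U are the tuning parameters. The sketch below builds a clamped uniform knot vector for a chosen m and fits each coordinate with scipy's make_lsq_spline; the parameterisation, sample and knot rule are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

def fit_lsq_bspline(u, points, m, degree=3):
    """Least-squares B-spline fit of a parameterised 2-D point sample.

    u      : ordered parameter values in [0, 1]
    points : (n, 2) array of noisy samples
    m      : number of control points (m >= degree + 1)
    degree : spline degree b
    Returns one BSpline per coordinate.
    """
    # Clamped knot vector U with m - degree - 1 uniform interior knots.
    interior = np.linspace(0.0, 1.0, m - degree + 1)[1:-1]
    t = np.concatenate([[0.0] * (degree + 1), interior, [1.0] * (degree + 1)])
    return [make_lsq_spline(u, points[:, d], t, k=degree) for d in range(2)]

# Noisy samples of a quarter circle, ordered by parameter u
rng = np.random.default_rng(3)
u = np.linspace(0.0, 1.0, 200)
pts = np.column_stack([np.cos(u * np.pi / 2), np.sin(u * np.pi / 2)])
pts += rng.normal(0, 0.01, pts.shape)

sx, sy = fit_lsq_bspline(u, pts, m=8)   # modest m: smooth, topologically faithful
print(sx(0.5), sy(0.5))                 # evaluate the fitted curve at u = 0.5
```

Sweeping m (and, secondarily, the degree) while recording the penalty f is the sensitivity analysis the abstract describes; with an exaggerated m the fitted curve starts to exhibit the spurious curls the authors detect spectrally.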

Relevance:

30.00%

Publisher:

Abstract:

This study investigated the rate of human papillomavirus (HPV) persistence, associated risk factors, and predictors of cytological alteration outcomes in a cohort of human immunodeficiency virus-infected pregnant women over an 18-month period. HPV was typed through L1 gene sequencing in cervical smears collected during gestation and at 12 months after delivery. Outcomes were defined as non-persistence (clearance of HPV in the second sample), re-infection (detection of different HPV types in the two samples), and type-specific HPV persistence (the same HPV type found in both samples). An unfavourable cytological outcome was considered when the second exam showed progression to squamous intraepithelial lesion or high-grade squamous intraepithelial lesion. Ninety patients were studied. HPV DNA persistence occurred in 50% of the cases, comprising type-specific persistence (30%) or re-infection (20%). A low CD4+ T-cell count at entry was a risk factor for type-specific persistence, re-infection, or HPV DNA persistence. The odds ratio (OR) was almost three times higher in the type-specific group than in the re-infection group (OR = 2.8; 95% confidence interval: 0.43-22.79). Our findings show that bona fide (type-specific) HPV persistence is a stronger predictor for the development of cytological abnormalities, highlighting the need for HPV typing as opposed to HPV DNA testing in the clinical setting.
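Odds ratios and confidence intervals like the one quoted above (OR = 2.8; 95% CI 0.43-22.79) come from a 2x2 table; a minimal sketch of the calculation with a Woolf (log) interval follows. The counts used are hypothetical placeholders, not the study's data.

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log) confidence interval for a 2x2 table:
         a = exposed with outcome,    b = exposed without outcome,
         c = unexposed with outcome,  d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = np.exp(np.log(or_) - z * se_log), np.exp(np.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts: cytological progression by type-specific persistence
# vs. re-infection (illustrative numbers only).
print(odds_ratio_ci(a=8, b=19, c=3, d=15))
```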

Relevance:

30.00%

Publisher:

Abstract:

The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low-frequency, high-consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, to flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuary and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is further estimated by the parametric joint distribution model, which is a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may alter due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and stream flow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with the MWWTP planning horizons. The spatial extent of flood risk is visualized through inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help stakeholders of the critical infrastructure become aware of the flood risk, vulnerability, and the inherent uncertainty.
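The EAD idea can be illustrated with a toy version of such a bivariate model: stage and duration tied together by a Gaussian copula with lognormal margins, and a step damage function for underground equipment. Every distributional choice, parameter and damage threshold below is an assumption for illustration, not the study's calibration.

```python
import numpy as np
from scipy import stats

def expected_annual_damage(n=200_000, rho=0.6, seed=11):
    """Monte Carlo estimate of Expected Annual Damage (EAD) under a toy
    bivariate flood model: stage and duration share a Gaussian copula with
    lognormal margins, and damage follows a simple step function.
    Assumes one flood event per year for simplicity."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                                      # copula uniforms
    stage = stats.lognorm(s=0.4, scale=2.0).ppf(u[:, 0])       # flood stage (m)
    duration = stats.lognorm(s=0.5, scale=12.0).ppf(u[:, 1])   # duration (h)

    # Step damage: equipment is lost if the stage tops the vault (2.5 m)
    # for longer than 6 hours; partial damage for shorter exceedances.
    damage = np.where((stage > 2.5) & (duration > 6), 500_000.0,
                      np.where(stage > 2.5, 100_000.0, 0.0))
    return damage.mean()          # expectation over the annual event distribution

print(f"EAD ~ ${expected_annual_damage():,.0f} per year")
```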

Relevance:

30.00%

Publisher:

Abstract:

Doctor of Philosophy in the Faculty of Business Administration

Relevance:

30.00%

Publisher:

Abstract:

Background: Asphyxia is considered an important cause of morbidity and mortality in neonates. This condition can affect many vital organs, including the central nervous system, and may eventually lead to death or developmental disorders. Objectives: Considering the high prevalence of asphyxia and its adverse consequences, the present study was conducted to evaluate the risk factors for birth asphyxia and assess their correlation with prognosis in asphyxiated infants. Patients and Methods: This two-year follow-up cohort study was conducted on 260 infants (110 asphyxiated infants and 150 healthy neonates) at Mashhad Ghaem Hospital during 2007-2014. Data collection tools consisted of a researcher-designed questionnaire including maternal and neonatal information and clinical/laboratory test results. The subjects were followed up using the Denver II test at 6, 12, 18, and 24 months after discharge. For data analysis, t-tests were performed using SPSS version 16.5. A P value ≤ 0.05 was considered statistically significant. Results: Of 260 neonates, 199 (76.5%) presented with normal neonatal outcomes and 61 (23.5%) with abnormal neonatal outcomes (developmental delay). Variables such as the severity of asphyxia (P < 0.001), five-minute Apgar score (P = 0.015), need for ventilation (P < 0.001), and severity of acidosis at birth (P = 0.001) were the major prognostic factors in infants with asphyxia. Additionally, prognosis was significantly poorer in boys and in infants with a history of dystocia (P < 0.001). Conclusions: The prevalence of risk factors for developmental delay, including severity of asphyxia, need for mechanical ventilation, severity of acidosis at birth, dystocia, and Apgar score, was lower in surviving infants; therefore, controlling these risk factors may reduce asphyxia-associated complications.

Relevance:

30.00%

Publisher:

Abstract:

Background: The use of the sagittal abdominal diameter (SAD) has been proposed for screening cardio-metabolic risk factors; however, its accuracy can be influenced by the choice of threshold values. Aim: To determine the SAD threshold values for cardio-metabolic risk factors in Mexican adults; to assess whether parallel and serial SAD testing can improve waist circumference (WC) sensitivity and specificity; and to analyze the effect of considering SAD along with WC and body mass index (BMI) in detecting cardio-metabolic risk. Methods: This cross-sectional study was conducted during 2012-2014 in Northeast Mexico (n = 269). Data on anthropometric, clinical, and biochemical measurements were collected. Sex-adjusted receiver-operating characteristic (ROC) curves were obtained using hypertension, dysglycemia, dyslipidemia and insulin resistance as individual outcomes and metabolic syndrome as a composite outcome. Age-adjusted odds ratios and 95% confidence intervals (CI) were estimated using logistic regression. Results: The threshold value for SAD with an acceptable combination of sensitivity and specificity was 24.6 cm in men and 22.5 cm in women. Parallel SAD testing improved WC sensitivity and serial testing improved WC specificity. The co-occurrence of high WC/high SAD increased the risk of insulin resistance 2.4-fold (95% CI: 1.1-5.3), high BMI/high SAD 4.3-fold (95% CI: 1.7-11.9), and high SAD alone 2.2-fold (95% CI: 1.2-4.2). Conclusions: The use of SAD together with traditional obesity indices such as WC and BMI has advantages over using either of these indices alone. SAD may be a powerful screening tool for interventions targeting high-risk individuals.
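Threshold selection for a marker like SAD typically comes from the ROC curve (for example by maximising Youden's J), and parallel/serial test combinations follow standard formulas under a conditional-independence assumption. The sketch below shows both steps on synthetic data; the data, the Youden rule and the assumed WC sensitivity/specificity are illustrative, not the study's values or method.

```python
import numpy as np
from sklearn.metrics import roc_curve

def youden_threshold(y_true, marker):
    """Threshold that maximises Youden's J = sensitivity + specificity - 1."""
    fpr, tpr, thresholds = roc_curve(y_true, marker)
    best = np.argmax(tpr - fpr)
    return thresholds[best], tpr[best], 1 - fpr[best]

def combine_tests(se1, sp1, se2, sp2):
    """Sensitivity/specificity of parallel (positive if either test is positive)
    and serial (positive only if both are positive) combinations of two tests,
    assuming conditional independence."""
    parallel = (se1 + se2 - se1 * se2, sp1 * sp2)
    serial = (se1 * se2, sp1 + sp2 - sp1 * sp2)
    return parallel, serial

# Toy data: metabolic-syndrome status and a synthetic SAD-like marker (cm)
rng = np.random.default_rng(5)
y = rng.integers(0, 2, 300)
sad = rng.normal(22, 2, 300) + 2.0 * y        # cases shifted upward
cut, se, sp = youden_threshold(y, sad)
print(f"SAD cut-off ~ {cut:.1f} cm, sensitivity {se:.2f}, specificity {sp:.2f}")
print(combine_tests(se, sp, 0.70, 0.75))      # combined with a WC test (assumed 0.70/0.75)
```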