970 results for Quality improvements
Abstract:
Wheat (Triticum aestivum L.), rice (Oryza sativa L.), and maize (Zea mays L.) provide about two-thirds of all energy in human diets, and four major cropping systems in which these cereals are grown represent the foundation of human food supply. Yield per unit time and land has increased markedly during the past 30 years in these systems, a result of intensified crop management involving improved germplasm, greater inputs of fertilizer, production of two or more crops per year on the same piece of land, and irrigation. Meeting future food demand while minimizing expansion of cultivated area primarily will depend on continued intensification of these same four systems. The manner in which further intensification is achieved, however, will differ markedly from the past because the exploitable gap between average farm yields and genetic yield potential is closing. At present, the rate of increase in yield potential is much less than the expected increase in demand. Hence, average farm yields must reach 70–80% of the yield potential ceiling within 30 years in each of these major cereal systems. Achieving consistent production at these high levels without causing environmental damage requires improvements in soil quality and precise management of all production factors in time and space. The scope of the scientific challenge related to these objectives is discussed. It is concluded that major scientific breakthroughs must occur in basic plant physiology, ecophysiology, agroecology, and soil science to achieve the ecological intensification that is needed to meet the expected increase in food demand.
Abstract:
Globally, increasing demand for biofuels has intensified the rate of land-use change (LUC) for expansion of bioenergy crops. In Brazil, the world's largest sugarcane-ethanol producer, sugarcane area has expanded by 35% (3.2 Mha) in the last decade. Sugarcane expansion has resulted in extensive pastures being subjected to intensive mechanization and large inputs of agrochemicals, which have direct implications for soil quality (SQ). We hypothesized that LUC to support sugarcane expansion leads to overall SQ degradation. To test this hypothesis we conducted a field study at three sites in the central-southern region to assess the SQ response to the primary LUC sequence (i.e., native vegetation to pasture to sugarcane) associated with sugarcane expansion in Brazil. At each land-use site, undisturbed and disturbed soil samples were collected from the 0-10, 10-20 and 20-30 cm depths. Soil chemical and physical attributes were measured through on-farm and laboratory analyses. A dataset of soil biological attributes was also included in this study. Initially, the LUC effects on each individual soil indicator were quantified. Afterward, the LUC effects on overall SQ were assessed using the Soil Management Assessment Framework (SMAF). Furthermore, six SQ indexes (SQI) were developed using approaches of increasing complexity. Our results showed that long-term conversion from native vegetation to extensive pasture led to soil acidification, significant depletion of soil organic carbon (SOC) and macronutrients [especially phosphorus (P)], and severe soil compaction, which creates an unbalanced ratio between water- and air-filled pore space within the soil and increases mechanical resistance to root growth. Conversion from pasture to sugarcane improved soil chemical quality by correcting for acidity and increasing macronutrient levels. 
Despite those improvements, most of the P added by fertilizer accumulated in less plant-available P forms, confirming the key role that organic P has in providing available P to plants in Brazilian soils. Long-term sugarcane production subsequently led to further SOC depletion. Sugarcane production had slight negative impacts on soil physical attributes compared with pasture land. Although the tillage performed for sugarcane planting and replanting alleviates soil compaction, our data suggested that the effects are short-lived, with persistent, reoccurring soil consolidation that increases erosion risk over time. These soil physical changes, induced by LUC, were detected by quantitative soil physical properties as well as by visual evaluation of soil structure (VESS), an on-farm and user-friendly method for evaluating SQ. The SMAF efficiently detected the overall SQ response to LUC, and it could be reliably used under Brazilian soil conditions. Furthermore, all of the SQI values developed in this study were able to rank SQ among land uses. We therefore recommend that simpler and more cost-effective SQI strategies, using a small number of carefully chosen soil indicators (such as pH, P, K, VESS and SOC) and proportional weighting within each soil sector (chemical, physical and biological), be used as a protocol for SQ assessments in Brazilian sugarcane areas. The SMAF and SQI scores suggested that long-term conversion from native vegetation to extensive pasture depleted overall SQ, driven by decreases in chemical, physical and biological indicators. In contrast, conversion from pasture to sugarcane had no negative impacts on overall SQ, mainly because chemical improvements offset negative impacts on biological and physical indicators. Therefore, our findings can be used as a scientific basis by farmers, extension agents and public policy makers to adopt and develop management strategies that sustain and/or improve SQ and the sustainability of sugarcane production in Brazil.
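The simple SQI strategy recommended above (a few carefully chosen indicators, scored and proportionally weighted within each soil sector, sectors then combined) can be sketched as follows. The indicator scores and equal weights below are hypothetical illustrations, not the study's calibrated scoring curves:

```python
# Hypothetical sketch of an additive soil quality index (SQI): each indicator
# is first scored on a 0-1 scale, scores are weighted proportionally (equally)
# within their sector, and the chemical, physical and biological sector means
# are averaged into one index.

def sqi(scores: dict) -> float:
    """scores maps sector name -> {indicator: score in [0, 1]}."""
    sector_means = []
    for sector, indicators in scores.items():
        # proportional (equal) weighting of indicators within each sector
        sector_means.append(sum(indicators.values()) / len(indicators))
    # equal weighting of the three soil sectors
    return sum(sector_means) / len(sector_means)

example = {
    "chemical": {"pH": 0.8, "P": 0.6, "K": 0.7},   # illustrative scores
    "physical": {"VESS": 0.5},
    "biological": {"SOC": 0.4},
}
print(round(sqi(example), 3))  # → 0.533
```

Because each sector contributes equally regardless of how many indicators it contains, a sector with a single cheap indicator (e.g., VESS) is not drowned out by a sector with several.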
Abstract:
Increasing economic competition drives industry to implement tools that improve the efficiency of its processes. Process automation is one of these tools, and Real Time Optimization (RTO) is an automation methodology that considers economic aspects to update the process control in accordance with market prices and disturbances. Basically, RTO uses a steady-state phenomenological model to predict the process behavior and then optimizes an economic objective function subject to this model. Although largely implemented in industry, there is no general agreement about the benefits of implementing RTO, due to some limitations discussed in the present work: structural plant/model mismatch, identifiability issues, and low frequency of set-point updates. Some alternative RTO approaches have been proposed in the literature to handle the problem of structural plant/model mismatch. However, there has been no systematic comparison evaluating the scope and limitations of these RTO approaches under different aspects. For this reason, the classical two-step method is compared to more recent derivative-based methods (Modifier Adaptation, Integrated System Optimization and Parameter estimation, and Sufficient Conditions of Feasibility and Optimality) using a Monte Carlo methodology. The results of this comparison show that the classical RTO method performs consistently, provided that the model is flexible enough to represent the process topology, the parameter estimation method is appropriate for the measurement noise characteristics, and a method is used to improve the quality of the sampled information. At each iteration, the RTO methodology updates key model parameters, at which point identifiability issues caused by a lack of measurements and by measurement noise can be observed, resulting in poor predictive ability. 
Therefore, four different parameter estimation approaches (Rotational Discrimination, Automatic Selection and Parameter estimation, Reparametrization via Differential Geometry, and classical nonlinear Least Squares) are evaluated with respect to their prediction accuracy, robustness and speed. The results show that the Rotational Discrimination method is the most suitable for implementation in an RTO framework, since it requires less a priori information, is simple to implement, and avoids the overfitting caused by the Least Squares method. The third RTO drawback discussed in the present thesis is the low frequency of set-point updates, which lengthens the periods in which the process operates under suboptimal conditions. An alternative for handling this problem is proposed in this thesis, integrating classic RTO and Self-Optimizing Control (SOC) using a new Model Predictive Control strategy. The new approach demonstrates that it is possible to reduce the problem of infrequent set-point updates, improving the economic performance. Finally, the practical aspects of RTO implementation are examined in an industrial case study, a Vapor Recompression Distillation (VRD) process located in the Paulínea refinery of Petrobras. The conclusions of this study suggest that the model parameters are successfully estimated by the Rotational Discrimination method; that the RTO is able to improve the process profit by about 3%, equivalent to 2 million dollars per year; and that the integration of SOC and RTO may be an interesting control alternative for the VRD process.
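The classical two-step RTO iteration discussed above can be sketched in a few lines: (1) estimate a model parameter from a noisy plant measurement, then (2) optimize the economic objective subject to the updated model. The plant, the structurally mismatched model, and all numbers below are toy illustrations, not the thesis's process models:

```python
# Toy two-step RTO loop. The "plant" profit is 4u - 1.2u^2 (optimum near
# u = 1.67), while the model assumes profit = theta*u - u^2: a deliberate
# structural mismatch, so the iteration settles at a suboptimal set point.
import numpy as np

def plant(u):
    # "true" plant profit, unknown to the optimizer
    return 4.0 * u - 1.2 * u**2

def rto_step(u, rng):
    y_meas = plant(u) + rng.normal(0.0, 0.01)  # step 1: noisy plant measurement
    theta = (y_meas + u**2) / u                # ...fit the model y = theta*u - u^2
    return theta / 2.0                         # step 2: analytic optimum of the model

rng = np.random.default_rng(0)
u = 1.0
for _ in range(10):                            # iterate set-point updates
    u = rto_step(u, rng)

# The mismatch (model curvature 1 vs plant curvature 1.2) makes the iteration
# converge near u ≈ 1.82 rather than the true plant optimum u ≈ 1.67.
print(round(u, 2))
```

This is exactly the plant/model-mismatch limitation the abstract describes: the two-step scheme converges, but to the model's optimum, not the plant's.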
Abstract:
Bibliography: leaves [19-20].
Abstract:
The purpose of this investigation was to evaluate the impact of undergoing peripheral blood stem cell transplantation (PBST) on quality of life (QoL), and to determine the effect of participating in a mixed-type, moderate-intensity exercise program on QoL. A further objective was to determine the relationship between peak aerobic capacity and QoL in PBST patients. QoL was assessed via the CARES questionnaire and peak aerobic capacity by a maximal graded treadmill test, pre-transplant (PI), post-transplant (PII) and following a 12-week intervention period (PIII). At PII, 12 patients were divided equally into control and exercise intervention groups. Undergoing a PBST was associated with a statistically but not clinically significant decline in QoL (P < 0.05). Following the intervention, exercising patients demonstrated an improved QoL when compared with pre-transplant ratings (P < 0.01) and with nonexercising transplant patients (P < 0.05). Moreover, peak aerobic capacity and QoL were correlated (P < 0.05). The findings demonstrated that exercise participation following oncology treatment is associated with a reduction in the number and severity of endorsed problems, which in turn leads to improvements in global, physical and psychosocial QoL. Furthermore, a relationship between fitness and QoL exists, with those experiencing higher levels of fitness also demonstrating higher QoL.
Abstract:
This paper examines the impact that the introduction of a closing call auction had on market quality at the London Stock Exchange. Using estimates from the partial adjustment with noise model of Amihud and Mendelson [Amihud, Y., Mendelson, H., 1987. Trading mechanisms and stock returns: An empirical investigation. Journal of Finance 42, 533–553], we show that opening and closing market quality improved for participating stocks. When we stratify our sample securities into five groups based on trading activity, we find that the least active securities experience the greatest improvements in market quality. A control sample of stocks is not characterized by discernible changes in market quality.
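The partial adjustment with noise model used above can be illustrated with a small simulation: observed prices adjust at speed g toward a random-walk intrinsic value, plus pricing noise, and g (the "market quality" adjustment-speed parameter) is recovered by regression. All parameter values here are arbitrary illustrations:

```python
# Simulate P_t = P_{t-1} + g*(V_t - P_{t-1}) + u_t with intrinsic value V_t a
# random walk, then recover the adjustment speed g by OLS of price changes on
# the value-price gap. Faster adjustment (g near 1) and smaller noise variance
# correspond to better market quality in this framework.
import numpy as np

def simulate(g, sigma_v, sigma_u, n, seed=0):
    rng = np.random.default_rng(seed)
    v = np.cumsum(rng.normal(0.0, sigma_v, n))       # intrinsic (log) value
    p = np.empty(n)
    p[0] = v[0]
    for t in range(1, n):
        p[t] = p[t - 1] + g * (v[t] - p[t - 1]) + rng.normal(0.0, sigma_u)
    return v, p

def estimate_g(v, p):
    dp = p[1:] - p[:-1]          # price changes
    gap = v[1:] - p[:-1]         # value-price gap
    return float(np.dot(gap, dp) / np.dot(gap, gap))  # no-intercept OLS slope

v, p = simulate(g=0.6, sigma_v=0.02, sigma_u=0.005, n=20000)
print(round(estimate_g(v, p), 2))  # recovers g ≈ 0.6
```

In the paper's empirical setting the intrinsic value is of course unobserved and the model is estimated differently, but the simulation shows what the adjustment-speed parameter measures.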
Abstract:
This research explores the role of internal customers in the delivery of external service quality. It considers the potentially different internal customer types that may exist within the organisation. Additionally, it explores potential differences in the dimensions used to measure service quality internally and externally. If there are different internal customer types, then there may be different dimensions used to measure service quality between these types, and this is considered as well. The approach adopted, given the depth and breadth of understanding required, was an action-research, case-based approach. The research objectives were: (i) To determine the dimensions of internal service quality between internal customer-supplier cells. (ii) To determine what variation, if any, there is in the dimension sets between internal customer-supplier cells. (iii) To determine any ranking in the dimensions that could exist by internal customer-supplier cell type. (iv) To investigate the impact of internal service quality on external service quality over time. The research findings were: (i) The majority of the dimensions used in measuring external service quality were also used internally. There were, however, additions of new dimensions, and some dimensions used externally had to be redefined for internal use. (ii) Variations in dimension sets were revealed during the research. Four different dimension sets were identified, and these were matched with four different types of internal service interaction. (iii) Differences in the ranking of dimensions within each dimension set for each internal customer-supplier cell type were confirmed. (iv) Internal service quality was seen to influence external service quality, but at a cellular level rather than the company level. At the company level, the average internal service quality at the start and finish of the research showed no improvement, but external service quality had improved. 
Further investigation at the cellular level showed that improvements in internal service quality had occurred. Those improvements were found to be in the cells that were closest to the customer. The research implications were found to be: (i) Some cells may not be necessary in the delivery of external service quality. (ii) The immediacy of the cell to the external customer and the number of interactions into and out of that cell have the greatest effect on external customer satisfaction. (iii) Internal service quality may be driven by the customer, affecting those cells at the front end of the business first. This then cascades back to the less immediate cells until ultimately the whole organisation shows improvements in internal service quality.
Abstract:
Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organisation. Achieving the desired level of commitment from top management can, however, be a difficulty. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. 
It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed. This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. 
Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected the production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with and can be instrumental in moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relation-ship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. 
Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars, who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern. Together, however, they also argue for or demonstrate the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.
Abstract:
Purpose: To evaluate lenses produced by excimer laser ablation of poly(methyl methacrylate) (PMMA) plates. Setting: University research laboratory. Methods: Two Nidek EC-5000 scanning-slit excimer laser systems were used to ablate plane-parallel plates of PMMA. The ablated lenses were examined by focimetry, interferometry, and mechanical surface profiling. Results: The spherical optical powers of the lenses matched the expected values, but the cylindrical powers were generally lower than intended. Interferometry revealed marked irregularity in the surface of negative corrections, which often had a positive “island” at their center. Positive corrections were generally smoother. These findings were supported by the results of mechanical profiling. Contrast sensitivity measurements carried out when observing through ablated lenses whose power had been neutralized with a suitable spectacle lens of opposite sign confirmed that the surface irregularities of the ablated lenses markedly reduced contrast sensitivity over a range of spatial frequencies. Conclusion: Improvements in beam delivery systems seem desirable.
Abstract:
Purpose: To compare monochromatic aberrations of keratoconic eyes when uncorrected, corrected with spherically-powered RGP (rigid gas-permeable) contact lenses, and corrected using simulations of customised soft contact lenses at different magnitudes of rotation (up to 15°) and translation (up to 1 mm) from their ideal position. Methods: The ocular aberrations of examples of mild, moderate and severe keratoconic eyes were measured when uncorrected and when wearing their habitual RGP lenses. Residual aberrations and point-spread functions of each eye corrected with an ideal, customised soft contact lens (designed to neutralise higher-order aberrations, HOA) were calculated as a function of the angle of rotation of the lens from its ideal orientation, and of its horizontal and vertical translation. Results: In agreement with the results of other authors, the RGP lenses markedly reduced both lower-order aberrations and HOA for all three patients. When compared with the RGP lens corrections, the customised lens simulations only provided optical improvements if their movements were constrained within limits which appear to be difficult to achieve with current technologies. Conclusions: At the present time, customised contact lens corrections appear likely to offer, at best, only minor optical improvements over RGP lenses for patients with keratoconus. If made in soft materials, however, these lenses may be preferred by patients in terms of comfort. © 2012 The College of Optometrists.
Abstract:
The population of older adults is rapidly increasing, creating a need for community services that assist vulnerable older adults in maintaining independence and quality of life. Recent evidence confirms the importance of food and nutrition in reaching this objective. The Elderly Nutrition Program (ENP) is part of a system of federally funded community-based programs, authorized through the Older Americans Act. ENP services include the home-delivered meals program, which targets frail homebound older adults at nutritional risk. Traditionally, ENP services provide a noon meal 5 days/week. This study evaluated the impact of expanding the home-delivered meals service to include breakfast + lunch on the nutritional status, quality of life and health care utilization of program participants. This cross-sectional study compared 2 groups. The Breakfast group (n = 167) received a home-delivered breakfast + lunch, 5 days/week. The Comparison group (n = 214) received lunch 5 days/week. Participants, recruited from 5 ENP programs, formed a geographically and racially/ethnically diverse sample. Participants ranged in age from 60 to 100 years; they were functionally limited, at high nutritional risk, and low income, and they lived alone and had difficulty shopping or preparing food. Participant data were collected through in-home interviews and program records. A 24-hour food recall and information on participant demographics, malnutrition risk, functional status, health care use, and applicable quality of life factors were obtained. Service and cost data were collected from program administrators. Breakfast group participants had greater energy/nutrient intakes (p < .05), fewer health care contacts (p < .05), and greater quality of life, measured as food security (p < .05) and fewer depressive symptoms (p < .05), than comparison group participants. These benefits were achieved for $1.30/person/day. 
The study identified links from improvements in nutritional status to enhanced quality of life to diminished health care utilization and expenditures. A model of health, loneliness, food enjoyment, food insecurity, and depression as factors contributing to quality of life for this population was proposed and tested (p < .01). The breakfast service is an inexpensive addition to traditional home-delivered meals services and can improve the lives of frail homebound older adults. Agencies should be encouraged to expand meals programs to include a breakfast service.
Abstract:
This research is motivated by the need for a systemic, efficient quality improvement methodology at universities. There exists no methodology designed for a total quality management (TQM) program in a university. The main objective of this study is to develop a TQM methodology that enables a university to efficiently develop an integral TQM plan. Current research focuses on the need to improve the quality of universities, the study of the universities perceived to be of the best quality, and the measurement of the quality of universities through rankings. There is no evidence of research on how to plan an integral quality improvement initiative for the university as a whole, which is the main contribution of this study. This research builds on various reference TQM models and criteria provided by ISO 9000, Baldrige and Six Sigma, and on educational accreditation criteria found in ABET and SACS. The TQM methodology is proposed by following a seven-step meta-methodology. The proposed methodology guides the user to develop a TQM plan in five sequential phases: initiation, assessment, analysis, preparation and acceptance. Each phase defines for the user its purpose, key activities, input requirements, controls, deliverables, and tools to use. The application of quality concepts in education and higher education is distinctive, since there are unique factors in education which ought to be considered. These factors shape the quality dimensions in a university and are the main inputs to the methodology. The proposed TQM methodology is used to guide the user to collect and transform appropriate inputs into a holistic TQM plan, ready to be implemented by the university. Different input data will lead to a unique TQM plan for the specific university at the time. 
It may not necessarily transform the university into a world-class institution, but it aims to strive for stakeholder-oriented improvements, leading to a better alignment with its mission and total quality advancement. The proposed TQM methodology is validated in three steps. First, it is verified by going through a test activity as part of the meta-methodology. Second, the methodology is applied to a case university to develop a TQM plan. Last, the methodology and the TQM plan are both verified by an expert group consisting of TQM specialists and university administrators. The proposed TQM methodology is applicable to any university at all levels of advancement, regardless of changes in its long-term vision and short-term needs. It helps to assure the quality of a TQM plan, while making the process more systemic, efficient, and cost effective. This research establishes a framework with a solid foundation for extending the proposed TQM methodology into other industries.
Abstract:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a shared responsibility to optimize the amount of radiation dose used in CT examinations. The key to dose optimization is to determine the minimum amount of radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would significantly benefit from effective metrics to characterize radiation dose and image quality for a CT exam. Moreover, if accurate predictions of the radiation dose and image quality were possible before the initiation of the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software tool for protocol optimization.
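The optimization principle stated above (find the minimum dose that achieves the targeted image quality) can be sketched numerically. This minimal illustration assumes the common first-order approximation that CT quantum noise scales as 1/sqrt(dose); the calibration numbers are hypothetical:

```python
# Under the noise ~ 1/sqrt(dose) approximation, a single calibration point
# (noise_ref at dose_ref) determines the smallest dose that keeps noise at or
# below a target level.
import math

def min_dose_for_noise(noise_ref, dose_ref, noise_target):
    """Smallest dose (same units as dose_ref) giving noise <= noise_target,
    assuming noise(d) = noise_ref * sqrt(dose_ref / d)."""
    return dose_ref * (noise_ref / noise_target) ** 2

# hypothetical calibration: 12 HU noise at a CTDIvol of 10 mGy; target 15 HU
dose = min_dose_for_noise(12.0, 10.0, 15.0)
print(round(dose, 2))  # → 6.4 (mGy)
```

In practice the target noise is task-dependent, which is why the thesis couples dose prediction with task-based image quality metrics rather than a single noise number.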
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions, and further evaluated the dependence of the organ dose coefficients on patient size and scanner model. Distinct from prior work, this study uses the largest number of patient models to date, with a representative range of age, weight percentile, and body mass index (BMI).
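In the CT dosimetry literature, the size dependence of organ dose coefficients is commonly modeled as an exponential decrease with effective patient diameter, h(d) = a * exp(-b * d). The sketch below fits such a model to synthetic, made-up calibration points (not the thesis's data) to show how the coefficients a and b would be extracted:

```python
# Fit h(d) = a * exp(-b * d) by ordinary least squares in log space:
# ln h = ln a - b*d is linear in d, so numpy.polyfit recovers (a, b).
import numpy as np

def fit_exponential(diam_cm, coeff):
    slope, ln_a = np.polyfit(diam_cm, np.log(coeff), 1)  # linear fit in log space
    return float(np.exp(ln_a)), float(-slope)            # (a, b)

diam = np.array([20.0, 25.0, 30.0, 35.0])       # effective diameter, cm
coeff = 2.0 * np.exp(-0.04 * diam)              # synthetic, noise-free points
a, b = fit_exponential(diam, coeff)
print(round(a, 2), round(b, 3))  # → 2.0 0.04
```

With real Monte Carlo data the points are noisy, but the same log-space fit yields per-organ, per-scanner coefficient curves that can then be indexed by a patient's measured size.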
With organ dose effectively quantified under constant tube current conditions, Chapter 4 aims to extend the organ dose-prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model was validated by comparing the predicted organ dose with dose estimates from Monte Carlo simulations in which the TCM function was explicitly modeled.
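The convolution-based idea can be illustrated with a toy model: the z-axis dose profile is the tube-current profile convolved with a normalized scatter-spread kernel, and organ dose is averaged over the slices the organ occupies. This is a simplified sketch assuming a Gaussian scatter spread and a hypothetical dose-per-mAs coefficient, not the thesis's actual implementation:

```python
import math


def gaussian_kernel(width_cm, sigma_cm, step_cm):
    """Scatter-spread kernel along z, normalized to unit area."""
    half = int(width_cm / (2 * step_cm))
    k = [math.exp(-0.5 * ((i * step_cm) / sigma_cm) ** 2)
         for i in range(-half, half + 1)]
    s = sum(k)
    return [v / s for v in k]


def dose_profile(ma_profile, kernel):
    """Convolve the TCM tube-current profile with the dose-spread kernel
    (zero-padded at the scan boundaries)."""
    n, half = len(ma_profile), len(kernel) // 2
    out = []
    for i in range(n):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < n:
                acc += ma_profile[idx] * kv
        out.append(acc)
    return out


def organ_dose(ma_profile, kernel, organ_slices, dose_per_mas):
    """Average the convolved profile over the slices covering the organ."""
    prof = dose_profile(ma_profile, kernel)
    return dose_per_mas * sum(prof[i] for i in organ_slices) / len(organ_slices)
```

A sanity check on this construction: for a constant tube current, the interior of the convolved profile reduces to the constant-current case, which is the limit Chapter 3 addresses.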
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice as an organ dose-monitoring program built on a commercial software platform (DoseWatch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations: the patient's major body landmarks were extracted from the scout image to match each clinical patient against a computational phantom in the library. Organ dose coefficients were then estimated from the CT protocol and patient size as reported in Chapter 3, and the exam CTDIvol, DLP, and TCM profiles were extracted to quantify the radiation field using the convolution technique proposed in Chapter 4.
With effective methods to predict and monitor organ dose in place, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines a method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study assessed quantum noise directly in clinical images and further validated the correspondence between phantom-based measurements and expected clinical image quality as a function of patient size and scanner attributes.
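One common way to estimate noise directly in clinical images, consistent in spirit with the approach described, is to take the mode of a histogram of local standard deviations, which is dominated by uniform soft-tissue regions rather than edges. A minimal sketch; the window size and bin width are illustrative assumptions, not the thesis's parameters:

```python
import statistics


def local_std_map(image, win=3):
    """Standard deviation in a win x win neighborhood at each interior pixel
    of a 2-D image given as a list of rows."""
    h, w = len(image), len(image[0])
    r = win // 2
    stds = []
    for i in range(r, h - r):
        for j in range(r, w - r):
            vals = [image[a][b]
                    for a in range(i - r, i + r + 1)
                    for b in range(j - r, j + r + 1)]
            stds.append(statistics.pstdev(vals))
    return stds


def global_noise_level(image, win=3, bin_width=0.5):
    """Mode of the local-std histogram: anatomy and edges produce large,
    scattered std values, while flat regions cluster at the noise level."""
    bins = {}
    for s in local_std_map(image, win):
        b = round(s / bin_width)
        bins[b] = bins.get(b, 0) + 1
    return max(bins, key=bins.get) * bin_width
```

Plotting this estimate against patient size for a set of clinical exams is one way to test the correspondence with phantom-based noise measurements mentioned above.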
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise-addition technique; (2) to develop quantitative and observer-based methods to validate the realism of the simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations, based on the minimum acceptable diagnostic performance, to achieve protocol optimization.
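Step (1) rests on a standard relation: since quantum noise variance scales inversely with dose, simulating a dose fraction f requires adding zero-mean noise with standard deviation σ_full·√(1/f − 1), so that the original and added components sum in quadrature to σ_full/√f. The sketch below assumes uncorrelated Gaussian noise in the image domain and deliberately ignores the noise texture and spatial correlation that a full projection-domain simulation would preserve:

```python
import math
import random


def simulate_reduced_dose(image, dose_fraction, sigma_full, seed=0):
    """Add zero-mean Gaussian noise to a 2-D image (list of rows) so that
    the result's noise level matches sigma_full / sqrt(dose_fraction).

    The added component has std sigma_full * sqrt(1/f - 1), because
    variances of independent noise sources add.
    """
    sigma_add = sigma_full * math.sqrt(1.0 / dose_fraction - 1.0)
    rng = random.Random(seed)
    return [[px + rng.gauss(0.0, sigma_add) for px in row] for row in image]
```

For example, simulating a quarter-dose scan (f = 0.25) from an image with 10 HU of noise adds a component of 10·√3 ≈ 17.3 HU, for a total of 20 HU, which matches the doubling of noise expected at one quarter of the dose.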
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.
Abstract:
Knight M, Acosta C, Brocklehurst P, Cheshire A, Fitzpatrick K, Hinton L, Jokinen M, Kemp B, Kurinczuk JJ, Lewis G, Lindquist A, Locock L, Nair M, Patel N, Quigley M, Ridge D, Rivero-Arias O, Sellers S, Shah A on behalf of the UKNeS coapplicant group.

Background: Studies of maternal mortality have been shown to result in important improvements to women's health. It is now recognised that in countries such as the UK, where maternal deaths are rare, the study of near-miss severe maternal morbidity provides additional information to aid disease prevention, treatment and service provision.

Objectives: To (1) estimate the incidence of specific near-miss morbidities; (2) assess the contribution of existing risk factors to incidence; (3) describe different interventions and their impact on outcomes and costs; (4) identify any groups in which outcomes differ; (5) investigate factors associated with maternal death; (6) compare an external confidential enquiry or a local review approach for investigating quality of care for affected women; and (7) assess the longer-term impacts.

Methods: Mixed quantitative and qualitative methods including primary national observational studies, database analyses, surveys and case studies overseen by a user advisory group.

Setting: Maternity units in all four countries of the UK.

Participants: Women with near-miss maternal morbidities, their partners and comparison women without severe morbidity.
Main outcome measures: The incidence, risk factors, management and outcomes of uterine rupture, placenta accreta, haemolysis, elevated liver enzymes and low platelets (HELLP) syndrome, severe sepsis, amniotic fluid embolism and pregnancy at advanced maternal age (≥ 48 years at completion of pregnancy); factors associated with progression from severe morbidity to death; associations between severe maternal morbidity and ethnicity and socioeconomic status; lessons for care identified by local and external review; economic evaluation of interventions for management of postpartum haemorrhage (PPH); women's experiences of near-miss maternal morbidity; long-term outcomes; and models of maternity care commissioned through experience-led and standard approaches.

Results: Women and their partners reported long-term impacts of near-miss maternal morbidities on their physical and mental health. Older maternal age and caesarean delivery are associated with severe maternal morbidity in both current and future pregnancies. Antibiotic prescription for pregnant or postpartum women with suspected infection does not necessarily prevent progression to severe sepsis, which may be rapidly progressive. Delay in delivery, of up to 48 hours, may be safely undertaken in women with HELLP syndrome in whom there is no fetal compromise. Uterine compression sutures are a cost-effective second-line therapy for PPH. Medical comorbidities are associated with a fivefold increase in the odds of maternal death from direct pregnancy complications. External reviews identified more specific clinical messages for care than local reviews. Experience-led commissioning may be used as a way to commission maternity services.

Limitations: This programme used observational studies, some with limited sample size, and the possibility of uncontrolled confounding cannot be excluded.
Conclusions: Implementation of the findings of this research could both prevent future severe pregnancy complications and improve pregnancy outcomes for women. One of the clearest findings relates to the population of women with other medical and mental health problems in pregnancy and their risk of severe morbidity. Further research into models of pre-pregnancy, pregnancy and postnatal care is clearly needed.