793 results for Case studies
Abstract:
Knight M, Acosta C, Brocklehurst P, Cheshire A, Fitzpatrick K, Hinton L, Jokinen M, Kemp B, Kurinczuk JJ, Lewis G, Lindquist A, Locock L, Nair M, Patel N, Quigley M, Ridge D, Rivero-Arias O, Sellers S, Shah A, on behalf of the UKNeS coapplicant group.
Background: Studies of maternal mortality have been shown to result in important improvements to women's health. It is now recognised that in countries such as the UK, where maternal deaths are rare, the study of near-miss severe maternal morbidity provides additional information to aid disease prevention, treatment and service provision.
Objectives: To (1) estimate the incidence of specific near-miss morbidities; (2) assess the contribution of existing risk factors to incidence; (3) describe different interventions and their impact on outcomes and costs; (4) identify any groups in which outcomes differ; (5) investigate factors associated with maternal death; (6) compare an external confidential enquiry with a local review approach for investigating quality of care for affected women; and (7) assess the longer-term impacts.
Methods: Mixed quantitative and qualitative methods, including primary national observational studies, database analyses, surveys and case studies, overseen by a user advisory group.
Setting: Maternity units in all four countries of the UK.
Participants: Women with near-miss maternal morbidities, their partners and comparison women without severe morbidity.
Main outcome measures: The incidence, risk factors, management and outcomes of uterine rupture, placenta accreta, haemolysis, elevated liver enzymes and low platelets (HELLP) syndrome, severe sepsis, amniotic fluid embolism and pregnancy at advanced maternal age (≥ 48 years at completion of pregnancy); factors associated with progression from severe morbidity to death; associations between severe maternal morbidity and ethnicity and socioeconomic status; lessons for care identified by local and external review; economic evaluation of interventions for management of postpartum haemorrhage (PPH); women's experiences of near-miss maternal morbidity; long-term outcomes; and models of maternity care commissioned through experience-led and standard approaches.
Results: Women and their partners reported long-term impacts of near-miss maternal morbidities on their physical and mental health. Older maternal age and caesarean delivery are associated with severe maternal morbidity in both current and future pregnancies. Antibiotic prescription for pregnant or postpartum women with suspected infection does not necessarily prevent progression to severe sepsis, which may be rapidly progressive. Delay in delivery of up to 48 hours may be safely undertaken in women with HELLP syndrome in whom there is no fetal compromise. Uterine compression sutures are a cost-effective second-line therapy for PPH. Medical comorbidities are associated with a fivefold increase in the odds of maternal death from direct pregnancy complications. External reviews identified more specific clinical messages for care than local reviews. Experience-led commissioning may be used as a way to commission maternity services.
Limitations: This programme used observational studies, some with limited sample sizes, and the possibility of uncontrolled confounding cannot be excluded.
Conclusions: Implementation of the findings of this research could both prevent future severe pregnancy complications and improve the outcome of pregnancy for women. One of the clearest findings relates to the population of women with other medical and mental health problems in pregnancy and their risk of severe morbidity. Further research into models of pre-pregnancy, pregnancy and postnatal care is clearly needed.
Abstract:
This paper describes and analyses the Audiovisual Technology Hub Programme (Programa Polos Audiovisuales Tecnológicos - PPAT), implemented in Argentina between 2010 and 2015 as part of the public policy of the former administration of Cristina Fernández de Kirchner. Its main goal was to promote a television industry that reflects the cultural diversity of Argentina by dividing the national territory into nine audiovisual technology hubs, in which national public universities acted as centres gathering a range of regional stakeholders. Considering the 18 seasons produced for television between 2013 and 2014, the text analyses the diversity of sources and genres/subgenres and their restricted marketing. The article closes with a brief set of conclusions about this initiative.
Abstract:
In deregulated power markets it is necessary to have an appropriate transmission pricing methodology that also takes congestion and reliability into account, in order to ensure economically viable, equitable and congestion-free power transfer capability with high reliability and security. This thesis presents the results of research on the development of a Decision Making Framework (DMF) of concepts, data-analytic and modelling methods for reliability-benefit-reflective evaluation of transmission cost for composite power systems, using probabilistic methods. The methodology within the DMF devised and reported in this thesis utilises a full AC Newton-Raphson load flow and a Monte Carlo approach to determine reliability indices, which are then used in the proposed Meta-Analytical Probabilistic Approach (MAPA) for the evaluation and calculation of the Reliability benefit Reflective Optimal Transmission Cost (ROTC) of a transmission system. The DMF includes methods for allocating transmission-line embedded costs among transmission transactions, accounting for line capacity use, as well as congestion costing that can be used for pricing, applying the Power Transfer Distribution Factor (PTDF) method as well as Bialek's tracing method. The MAPA utilises bus data, generator data, line data, reliability data and Customer Damage Function (CDF) data for congestion, transmission and reliability costing studies using the proposed application of PTDF and other established methods, which are then compared, analysed and selected according to the area/state requirements and integrated to develop the ROTC. Case studies involving standard 7-bus, IEEE 30-bus and 146-bus Indian utility test systems are conducted and reported in the relevant sections of the dissertation. The results obtained through the proposed application of the PTDF method correlate closely with those of Bialek's method and different MW-Mile methods. The novel contributions of this research are: first, the application of the PTDF method developed for the determination of transmission and congestion costing, compared with other proven methods, with the viability of the developed method explained in the methodology, discussion and conclusion chapters; second, the development of a comprehensive DMF that helps decision makers analyse and select a costing approach according to their requirements, since all the costing approaches in the DMF have been integrated to achieve the ROTC; and third, the formulation of the composite methodology for calculating the ROTC as suites of algorithms and MATLAB programs for each part of the DMF, further described in the methodology section. The dissertation concludes with suggestions for future work.
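The abstract names PTDF-based allocation as its core technique but does not reproduce the formulation; as background, a minimal sketch of how DC power-flow PTDFs are typically computed is given below. The three-bus data are purely illustrative and are not one of the thesis test systems (which use MATLAB and a full AC load flow).

```python
import numpy as np

def ptdf_matrix(n_bus, branches, slack=0):
    """DC power-flow PTDF matrix.

    branches: list of (from_bus, to_bus, reactance_pu) tuples, 0-indexed buses.
    Returns an (n_branch x n_bus) array giving the sensitivity of each branch
    flow to a 1 MW injection at each bus, withdrawn at the slack bus.
    """
    n_br = len(branches)
    A = np.zeros((n_br, n_bus))      # branch-bus incidence matrix
    Bd = np.zeros((n_br, n_br))      # diagonal branch susceptances
    for l, (i, j, x) in enumerate(branches):
        A[l, i], A[l, j] = 1.0, -1.0
        Bd[l, l] = 1.0 / x
    B = A.T @ Bd @ A                 # nodal susceptance matrix
    keep = [b for b in range(n_bus) if b != slack]
    X = np.zeros((n_bus, n_bus))     # slack row/column stay zero
    X[np.ix_(keep, keep)] = np.linalg.inv(B[np.ix_(keep, keep)])
    return Bd @ A @ X                # branch flow per unit nodal injection

# Tiny 3-bus example with assumed reactances
branches = [(0, 1, 0.1), (1, 2, 0.1), (0, 2, 0.2)]
print(ptdf_matrix(3, branches, slack=0))
```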
Abstract:
This dissertation explores why some states consistently secure food imports at prices higher than the world market price, thereby exacerbating food insecurity domestically. I challenge the idea that free-market economics alone can explain these trade behaviors and instead argue that states take political considerations into account when engaging in food trade, which results in inefficient trade. In particular, states that are dependent on imports of staple food products, like cereals, are wary of the potential strategic value of these goods to exporters. I argue that this consideration, combined with the importing state's ability to mitigate that risk through its own forms of political or economic leverage, shapes the behavior of the importing state and contributes to its potential for food security. In addition to cross-national analyses, I use case studies of the Gulf Cooperation Council states and Jordan to demonstrate how the political tools available to these importers affect their food security. The results of my analyses suggest that when import-dependent states have access to forms of political leverage, they are more likely to trade efficiently, thereby increasing their potential for food security.
Abstract:
Participation Space Studies explore eParticipation in the day-to-day activities of local, citizen-led groups, working to improve their communities. The focus is the relationship between activities and contexts. The concept of a participation space is introduced in order to reify online and offline contexts where people participate in democracy. Participation spaces include websites, blogs, email, social media presences, paper media, and physical spaces. They are understood as sociotechnical systems: assemblages of heterogeneous elements, with relevant histories and trajectories of development and use. This approach enables the parallel study of diverse spaces, on and offline. Participation spaces are investigated within three case studies, centred on interviews and participant observation. Each case concerns a community or activist group, in Scotland. The participation spaces are then modelled using a Socio-Technical Interaction Network (STIN) framework (Kling, McKim and King, 2003). The participation space concept effectively supports the parallel investigation of the diverse social and technical contexts of grassroots democracy and the relationship between the case-study groups and the technologies they use to support their work. Participants’ democratic participation is supported by online technologies, especially email, and they create online communities and networks around their goals. The studies illustrate the mutual shaping relationship between technology and democracy. Participants’ choice of technologies can be understood in spatial terms: boundaries, inhabitants, access, ownership, and cost. Participation spaces and infrastructures are used together and shared with other groups. Non-public online spaces, such as Facebook groups, are vital contexts for eParticipation; further, the majority of participants’ work is non-public, on and offline. It is informational, potentially invisible, work that supports public outputs. The groups involve people and influence events through emotional and symbolic impact, as well as rational argument. Images are powerful vehicles for this and digital images become an increasingly evident and important feature of participation spaces throughout the consecutively conducted case studies. Collaboration of diverse people via social media indicates that these spaces could be understood as boundary objects (Star and Griesemer, 1989). The Participation Space Studies draw from and contribute to eParticipation, social informatics, mediation, social shaping studies, and ethnographic studies of Internet use.
Abstract:
Effective supplier evaluation and purchasing processes are of vital importance to business organizations, making the supplier selection problem a key issue for their success. We consider a complex supplier selection problem with multiple products, where minimum package quantities, minimum order values related to delivery costs, and discounted pricing schemes are taken into account. Our main contribution is to present a mixed integer linear programming (MILP) model for this supplier selection problem. The model is used to solve several examples, including three real case studies from an electronic equipment assembly company.
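As a rough illustration of the kind of MILP the abstract describes, the sketch below covers minimum package quantities and minimum order values using PuLP. The data, variable names and simplifications (discounted pricing schemes are omitted) are assumptions for illustration, not the paper's model.

```python
import pulp

suppliers = ["S1", "S2"]
products = ["P1", "P2"]
price = {("S1", "P1"): 4.0, ("S1", "P2"): 7.0, ("S2", "P1"): 5.0, ("S2", "P2"): 6.0}
pack = {("S1", "P1"): 10, ("S1", "P2"): 5, ("S2", "P1"): 10, ("S2", "P2"): 5}  # min package sizes
demand = {"P1": 120, "P2": 45}
min_order_value = {"S1": 100.0, "S2": 80.0}   # smallest order worth the delivery cost
big_m = sum(demand.values())

m = pulp.LpProblem("supplier_selection", pulp.LpMinimize)
n_pack = pulp.LpVariable.dicts("packs", price, lowBound=0, cat="Integer")  # packages ordered
use = pulp.LpVariable.dicts("use", suppliers, cat="Binary")                # supplier selected

qty = {k: n_pack[k] * pack[k] for k in price}           # units = packages * package size
m += pulp.lpSum(price[k] * qty[k] for k in price)       # minimise total purchasing cost

for p in products:                                      # meet demand for every product
    m += pulp.lpSum(qty[s, p] for s in suppliers) >= demand[p]
for s in suppliers:
    m += pulp.lpSum(qty[s, p] for p in products) <= big_m * use[s]          # link qty to selection
    m += pulp.lpSum(price[s, p] * qty[s, p] for p in products) >= min_order_value[s] * use[s]

m.solve(pulp.PULP_CBC_CMD(msg=False))
for k in price:
    if n_pack[k].value() and n_pack[k].value() > 0:
        print(k, int(n_pack[k].value() * pack[k]), "units")
```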
Abstract:
The core aim of this paper is to evaluate, through selected company case studies, the extent to which companies were able to join global value chains (GVCs).
Abstract:
This thesis examines cultural policy for film in Scotland from 1997 to 2010. It explores the extent to which the industry is shaped by film policy strategies and by the agency of public funding bodies. It reflects on how Scottish Screen, Scotland's former screen agency, articulated its role as a national institution concerned with both commercial and cultural remits, and with the conflicting interests of different industry groups. The study examines how the agency developed funding schemes to fulfil policy directives during a tumultuous period in Scottish cultural policy history, following the establishment of the Scottish Parliament with the Scotland Act 1998 and preceding the Independence Referendum Act 2013. In order to investigate how policy has shaped the development of a national film industry, two further case studies are explored: Tartan Shorts, Scotland's former flagship short film scheme, and the Audience Development Fund, Scotland's first project-based film exhibition scheme. The first study explores the planning, implementation and evaluation of the scheme as part of the agency's talent development strategy; its outcomes show the potential impact of funding methods aimed at developing and retaining Scottish filmmaking talent. Thereafter, the Scottish exhibition sector is discussed, a previously unexplored field within film policy discussions and academic debate. This part outlines Scottish Screen's legacy to current film exhibition funding practices and the practical mechanisms the agency used to foster Scottish audiences. By mapping the historical and political terrain, the research analyses the specificity of Scotland within the UK context and explores areas in which short-term, context-driven policies become problematic. The work concludes by presenting the advantages and issues arising from film funding practices, and advocates what is needed for the film industry in Scotland today, with suggestions for long-term and cohesive policy development.
Abstract:
We propose a method, denoted the synthetic portfolio, for event studies in market microstructure that is particularly useful with high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that, for Colombian Stock Market securities, the asynchronicity of intraday data reduces the analysis to a selected group of stocks; however, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism is not enhancing the price discovery process.
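For orientation, a minimal sketch of the synthetic-portfolio idea is shown below: donor weights are non-negative, sum to one, and are fitted to track the target asset's returns over a pre-event window, after which the post-event gap estimates the effect of the auction. The data are simulated and the optimiser choice is an assumption; the paper works with Colombian intraday returns.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
pre, post, n_donors = 120, 30, 8
donors = rng.normal(0, 0.01, size=(pre + post, n_donors))            # donor asset returns
target = donors[:, :3] @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 0.002, pre + post)

def tracking_error(w, R, r):
    return np.sum((r - R @ w) ** 2)

w0 = np.full(n_donors, 1.0 / n_donors)
res = minimize(
    tracking_error, w0, args=(donors[:pre], target[:pre]),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * n_donors,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],     # weights sum to one
)
w = res.x

# Post-event gap between the asset and its synthetic (tracking) portfolio:
# the estimated effect of the event, here a volatility call auction.
gap = target[pre:] - donors[pre:] @ w
print("mean post-event gap:", gap.mean())
```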
Abstract:
In aircraft component maintenance shops, components are distributed amongst repair groups and their respective technicians based on the type of repair, on the technicians' skills and workload, and on the customer-required dates. This distribution planning is typically done empirically, based on the group leader's past experience. Such a procedure does not provide any performance guarantees, frequently leading to undesirable delays in the delivery of aircraft components. Among others, a fundamental challenge faced by the group leaders is deciding how to distribute components that arrive without customer-required dates. This paper addresses the problems of prioritizing randomly arriving aircraft components (with or without pre-assigned customer-required dates) and of optimally distributing them amongst the technicians of the repair groups. We propose a formula for prioritizing the list of repairs, pointing out the importance of selecting good estimators for the interarrival times between repair requests, the turn-around times and the man-hours for repair. In addition, a model for the assignment and scheduling problem is designed, and a preliminary algorithm along with a numerical illustration is presented.
Abstract:
The time for conducting Preventive Maintenance (PM) on an asset is often determined using a predefined alarm limit based on trends of a hazard function. In this paper, the authors propose using both hazard and reliability functions to improve the accuracy of the prediction, particularly when the failure characteristics over the asset's whole life are modelled using different failure distributions for the different stages of its life. The proposed method is validated using simulations and case studies.
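The sketch below is a minimal illustration of the general idea, not the authors' algorithm: with a Weibull life model, PM is scheduled at the earliest time where either the hazard rate exceeds a predefined alarm limit or the reliability drops below a target. The shape and scale parameters and both thresholds are assumed values.

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    return (beta / eta) * (t / eta) ** (beta - 1)

def weibull_reliability(t, beta, eta):
    return np.exp(-((t / eta) ** beta))

t = np.linspace(1.0, 10000.0, 10000)      # operating hours
beta, eta = 3.0, 6000.0                   # assumed wear-out stage parameters

hazard = weibull_hazard(t, beta, eta)
reliability = weibull_reliability(t, beta, eta)

hazard_limit = 4e-4          # predefined alarm limit on the hazard rate
reliability_target = 0.90    # additional reliability requirement

# PM is due as soon as either criterion is violated
due = (hazard > hazard_limit) | (reliability < reliability_target)
pm_time = t[np.argmax(due)] if due.any() else None
print(f"schedule PM at about {pm_time:.0f} h")
```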
Abstract:
This research analyses techniques for implementing a cell-centred finite-volume time-domain (ccFV-TD) computational methodology for studying microwave heating. Various state-of-the-art spatial and temporal discretisation methods employed to solve Maxwell's equations on multidimensional structured grid networks are investigated, and the dispersive and dissipative errors inherent in those techniques examined. Both staggered and unstaggered grid approaches are considered. Upwind schemes using a Riemann solver and intensity vector splitting are studied and evaluated. Staggered and unstaggered Leapfrog and Runge-Kutta time integration methods are analysed in terms of phase and amplitude error to identify which method is the most accurate and efficient for simulating microwave heating processes. The implementation and migration of typical electromagnetic boundary conditions from staggered-in-space to cell-centred approaches is also considered. In particular, an existing perfectly matched layer absorbing boundary methodology is adapted to formulate a new cell-centred boundary implementation for the ccFV-TD solvers. Finally, for microwave heating purposes, a comparison of analytical and numerical results for standard case studies in rectangular waveguides allows the accuracy of the developed methods to be assessed.
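As a point of reference for the staggered Leapfrog integrator the abstract mentions, a minimal 1D staggered-grid (Yee-style) leapfrog update for Maxwell's equations in vacuum is sketched below, in normalised units with assumed grid and source parameters. The thesis itself works with cell-centred finite volumes in multiple dimensions, so this is only the textbook baseline, not its method.

```python
import numpy as np

nx, nt, S = 400, 600, 0.5   # grid points, time steps, Courant number c*dt/dx
ez = np.zeros(nx)           # E field on integer grid points
hy = np.zeros(nx - 1)       # H field on half grid points (staggered)

for n in range(nt):
    # Leapfrog: H advanced half a step from the spatial difference of E...
    hy += S * (ez[1:] - ez[:-1])
    # ...then E advanced from the spatial difference of H (PEC walls at both ends)
    ez[1:-1] += S * (hy[1:] - hy[:-1])
    # Soft source: a Gaussian pulse injected near the left wall
    ez[20] += np.exp(-((n - 60.0) / 20.0) ** 2)

print("peak |Ez| after", nt, "steps:", np.abs(ez).max())
```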
Abstract:
Even though today's corporations recognize that they need to understand modern project management techniques (Schwalbe, 2002, p. 2), many researchers continue to provide evidence of poor IT project success. With Kotnour (2000) finding that project performance is positively associated with project knowledge, a better understanding of how to effectively manage knowledge in IT projects should have considerable practical significance for increasing the chances of project success. Using a combined qualitative/quantitative method of data collection in multiple case studies spanning four continents, and comprising a variety of organizational types, this research centered on the question of why individuals working within IT project teams might be motivated towards, or inhibited from, sharing their knowledge and experience in their activities, procedures and processes. The research concluded with the development of a new theoretical model of knowledge-sharing behavior, 'The Alignment Model of Motivational Focus'. This model suggests that an individual's propensity to share knowledge and experience is a function of the perceived personal benefits and costs associated with the activity, balanced against the individual's alignment to a group of 'institutional' factors. These factors are identified as alignments to the project team, to the organization and, depending on the circumstances, to either the professional discipline or the community of practice to which the individual belongs.
Abstract:
Networked control over data networks has received increasing attention in recent years. Among the many problems in networked control systems (NCSs) is the need to reduce control latency and jitter and to deal with packet dropouts. This paper introduces our recent progress on a queuing communication architecture for real-time NCS applications, and simple strategies for dealing with packet dropouts. Case studies for a medium-scale process or multiple small-scale processes are presented for TCP/IP-based real-time NCSs. Variations of network architecture design are modelled, simulated and analysed to evaluate control latency and jitter performance. It is shown that a simple bandwidth upgrade or adding hierarchy does not necessarily improve control latency and jitter. A co-design of network and control is necessary to maximise the real-time control performance of NCSs.
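The abstract refers to simple strategies for packet dropouts without spelling one out; a commonly used baseline is for the actuator to hold the last successfully received control value when a packet is lost. The sketch below simulates that strategy for an assumed first-order plant, proportional gain and dropout probability; it is an illustration of the general idea, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 0.95, 0.1          # assumed discrete-time first-order plant: x_next = a*x + b*u
kp, drop_prob = 4.0, 0.2  # proportional gain and packet-dropout probability
x, u_applied = 1.0, 0.0   # initial state and the last control value held by the actuator

for k in range(100):
    u = -kp * x                              # controller output (regulate to zero)
    if rng.random() > drop_prob:             # actuation packet delivered over the network
        u_applied = u
    # else: the actuator holds the last successfully received value
    x = a * x + b * u_applied
print("final state magnitude:", abs(x))
```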
Abstract:
Networks have come to occupy a key position in the strategic armoury of the government, business and community sectors and now have an impact on a broad array of policy and management arenas. An emphasis on relationships, trust and mutuality means that networks function on a different operating logic from the conventional processes of government and business. It is therefore important that organizational members of networks are able to adopt the skills and culture necessary to operate successfully under these distinctive kinds of arrangements. Because networks function from a different operational logic to traditional bureaucracies, public sector organizations may experience difficulties in adapting to networked arrangements. Networks are formed to address a variety of social problems or to meet capability gaps within organizations. As such, they are often under pressure to produce measurable outcomes quickly, and need to form rapidly and come to full operation quickly. This paper presents a theoretical exploration of how diverse types of networks are required for different management and policy situations, and draws on a set of public sector case studies to understand and demonstrate how these various types of networked arrangements may be 'turbo-charged' so that they more quickly adopt the characteristics necessary to deliver required outcomes.