959 results for Average Case Complexity
Abstract:
Purpose - The main objective of the paper is to develop a risk management framework for software development projects from the developers' perspective. Design/methodology/approach - This study uses a combined qualitative and quantitative technique with the active involvement of stakeholders in order to identify, analyze and respond to risks. The entire methodology is explained using a case study of a software development project in a public sector organization in Barbados. Findings - An analytical approach to managing risk in software development ensures effective delivery of projects to clients. Research limitations/implications - The proposed risk management framework has been applied to a single case. Practical implications - Software development projects are characterized by technical complexity, market and financial uncertainties, and the availability of competent manpower. Therefore, successful project accomplishment depends on addressing those issues throughout the project phases. Effective risk management ensures the success of projects. Originality/value - There are several studies on managing risks in software development and information technology (IT) projects. Most of these studies identify and prioritize risks through empirical research in order to suggest mitigating measures. Although they are important to clients for future projects, these studies fail to provide any framework for risk management from the software developers' perspective. Although a few studies have introduced frameworks for risk management in software development, most are presented from the clients' perspective, and very little effort has been made to integrate this with the software development cycle. As software developers absorb a considerable amount of risk, an integrated framework for managing risks in software development from the developers' perspective is needed. © Emerald Group Publishing Limited.
Abstract:
Over 60% of the recurrent budget of the Ministry of Health (MoH) in Angola is spent on the operations of the fixed health care facilities (health centres plus hospitals). However, to date, no study has attempted to investigate how efficiently those resources are used to produce health services. Therefore the objectives of this study were to assess the technical efficiency of public municipal hospitals in Angola; to assess changes in productivity over time with a view to analyzing changes in efficiency and technology; and to demonstrate how the results can be used in the pursuit of the public health objective of promoting efficiency in the use of health resources. The analysis was based on 3-year panel data from all 28 public municipal hospitals in Angola. Data Envelopment Analysis (DEA), a non-parametric linear programming approach, was employed to assess technical and scale efficiency and productivity change over time using the Malmquist index. The results show that, on average, productivity of municipal hospitals in Angola increased by 4.5% over the period 2000-2002; that growth was due to improvements in efficiency rather than innovation. © 2008 Springer Science+Business Media, LLC.
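The abstract does not spell out the model, but DEA efficiency scores of the kind reported here are commonly computed from the input-oriented CCR linear program: for each hospital, find the smallest input contraction factor theta at which a convex-cone combination of all hospitals still matches its outputs. A minimal sketch, assuming SciPy is available and using made-up data (the actual inputs and outputs of the Angolan hospitals are not given in the abstract):

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """Input-oriented CCR efficiency score for each decision-making unit.
    X: (n_units, n_inputs) input matrix; Y: (n_units, n_outputs) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for k in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                                  # minimise theta
        A_ub, b_ub = [], []
        for i in range(m):                          # inputs: sum_j lam_j * x_ij <= theta * x_ik
            A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):                          # outputs: sum_j lam_j * y_rj >= y_rk
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[k, r])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)
```

A unit scoring 1.0 lies on the efficient frontier. The Malmquist index is then built from such distance scores evaluated across pairs of periods, separating efficiency change from frontier (technology) shift.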
Abstract:
Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to “break down” when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical strategy paradigms employed to manage quality are reviewed, and the advantages and limitations of these paradigms are highlighted. A major implication of this review is that when faced with complexity, an ideological stance toward any single strategy paradigm for the management of quality is ineffective. A case study is used to demonstrate the need for an integrative multi-paradigm approach to the management of quality as complexity increases.
Abstract:
East-West trade has grown rapidly since the sixties, stimulating a parallel expansion in the literature on the subject. An extensive review of this literature shows how: (i) most of the issues involved have at their source the distinctions between East and West in political ideology and/or economic management, and (ii) there has been a tendency to keep theoretical and practical perspectives on the subject too separate. This thesis demonstrates the importance of understanding the fundamental principles implied in the first point, and represents an attempt to bridge the gap identified in the second. A detailed study of the market for fire fighting equipment in Eastern Europe is undertaken in collaboration with a medium-sized company, Angus Fire Armour Limited. Desk research methods are combined with visits to the market to assess the potential for the company's products, and recommendations for future strategy are made. The case demonstrates the scope and limitations of various research methods for the East European market, and a model for market research relevant to all companies is developed. The case study highlights three areas largely neglected in the literature: (i) the problems of internal company adaptation to East European conditions; (ii) the division of responsibility between foreign trade organisations; and (iii) bribery and corruption in East-West trade. Further research into the second topic - through a survey of 36 UK exporters - and the third - through analysis of publicised corruption cases - confirms the representativeness of the Angus experience, and reflects on the complexity of the East European import process, which does not always function as is commonly supposed. The very complexity of the problems confronting companies reaffirms the need to appreciate the principles underlying the subject, while the detailed analysis into questions of, originally, a marketing nature reveals wider implications for East-West trade and East-West relations.
Abstract:
In this poster we present our preliminary work on spammer detection and analysis, using 50 active honeypot profiles implemented on the Weibo.com and QQ.com microblogging networks. We separated spammers from legitimate users by manually checking every captured user's microblog content. We built a spammer dataset for each social network community from these spammer accounts, as well as a legitimate user dataset. We analyzed several features of the two user classes and compared them, finding these features useful for distinguishing spammers from legitimate users. The following are initial observations from our analysis of the spammers captured on Weibo.com and QQ.com:
- The following/follower ratio of spammers is usually higher than that of legitimate users. They tend to follow a large number of users in order to gain popularity but usually have relatively few followers.
- There is a big gap between the average number of microblogs posted per day by the two classes. On Weibo.com, spammers post many microblogs every day, far more than legitimate users do, while on QQ.com spammers post far fewer microblogs than legitimate users. This is mainly due to the different strategies taken by spammers on the two platforms.
- More spammers choose a cautious spam posting pattern. They mix spam microblogs with ordinary ones so that they can evade the anti-spam mechanisms deployed by the service providers.
- Aggressive spammers are more likely to be detected, so they tend to have a shorter life, while cautious spammers can live much longer and have a deeper influence on the network. The latter kind of spammer may become the trend in social network spam. © 2012 IEEE.
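The two headline features, the following/follower ratio and posts per day, can be sketched as below. The field names and example records are illustrative stand-ins, not taken from the Weibo.com or QQ.com datasets:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Account:
    following: int    # accounts this user follows
    followers: int    # accounts following this user
    n_posts: int      # total microblogs posted
    created: date     # account creation date
    observed: date    # date of data capture

def ff_ratio(a: Account) -> float:
    # Spammers tend to follow many accounts to gain popularity while
    # attracting few followers, so this ratio is typically higher for them.
    return a.following / max(a.followers, 1)

def posts_per_day(a: Account) -> float:
    # Average posting rate over the account's observed lifetime.
    days = max((a.observed - a.created).days, 1)
    return a.n_posts / days
```

In practice such features would feed a classifier or threshold rule; the poster's comparison suggests the useful direction of each feature differs by platform (e.g. posting rate).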
Abstract:
This study proposes an integrated analytical framework for effective management of project risks using a combined multiple criteria decision-making technique and decision tree analysis. First, a conceptual risk management model was developed through a thorough literature review. The model was then applied through action research on a petroleum oil refinery construction project in central India in order to demonstrate its effectiveness. Oil refinery construction projects are risky because of technical complexity, resource unavailability, the involvement of many stakeholders and strict environmental requirements. Although project risk management has been researched extensively, a practical and easily adoptable framework is missing. In the proposed framework, risks are identified using a cause-and-effect diagram, analysed using the analytic hierarchy process, and responses are developed using a risk map. Additionally, decision tree analysis allows modelling of various options for risk response development and optimises the selection of risk mitigating strategies. The proposed risk management framework could be easily adopted and applied in any project and integrated with other project management knowledge areas.
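The analytic hierarchy process step mentioned above derives criteria weights from a pairwise comparison matrix via its principal eigenvector, with Saaty's consistency ratio checking judgement coherence. A minimal sketch (the paper's actual risk criteria and pairwise judgements are not given in the abstract):

```python
import numpy as np

# Saaty's random consistency index values for matrix sizes 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(A):
    """Priority weights from a pairwise comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio (CR)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)            # principal (Perron) eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                        # normalise weights to sum to 1
    lam_max = vals[k].real
    ci = (lam_max - n) / (n - 1) if n > 1 else 0.0
    cr = ci / RI[n] if RI.get(n, 0) else 0.0
    return w, cr
```

A CR below roughly 0.1 is conventionally taken to mean the pairwise judgements are acceptably consistent; otherwise the decision-makers revisit them.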
Abstract:
This paper investigates the associations between audit pricing and multidimensional characteristics of local governments by using a sample of Greek municipalities. The Greek institutional setting is interesting because it is politically pluralistic. Moreover, the audits are performed exclusively by independent auditors appointed through a bid process. Our results suggest considerable variation in audit fees, mainly driven by politically related factors, indicating the importance of the relevant theoretical predictions for audit pricing in the public sector. Agency costs appear strong enough to explain audit pricing. We also confirm prior findings on the significance of audit complexity and size. Results also suggest that audit fees are reduced when an internal team dedicated to accrual accounting is appointed. Therefore, our conclusions offer practical implications for policy setters and regulators in the public sector in relation to audit quality.
Abstract:
Significant changes in accounting disclosure are observed in periods of economic change such as those relating to emerging capital markets and programs of privatization. Measurement of the level of accounting disclosure should ideally be designed to capture the complexity of change in order to give insight and explanation to match the causes and consequences of change. This paper shows the added interpretive value in subdividing the disclosure checklist to reflect the requirements of national accounting regulations, the location of disclosure items in the annual report, and limitations on the availability of regulations in official translation to the local language. Defining targeted disclosure categories leads to significance testing of specific aspects of changes in accounting disclosure in the Egyptian capital market in the 1990s. Strong correlation of disclosure with the presence of majority government ownership of the company and the relative activity of share trading supports the applicability of political costs and capital need theories, respectively. The relation between International Accounting Standards (IASs) disclosure and the type of audit firm points to additional theoretical explanations, including relative familiarity with the legislation and compliance features identifiable with the emerging capital market. The approach described in this paper has the potential for enhancing understanding of the complexity of accounting change in other emerging capital markets and developing economies.
Abstract:
Novel molecular complexity measures are designed based on quantum molecular kinematics. The Hamiltonian matrix, constructed in a quasi-topological approximation, describes the temporal evolution of the modelled electronic system and determines the time derivatives of the dynamic quantities. This makes it possible to define average quantum kinematic characteristics closely related to the curvatures of the electron paths, particularly the torsion, which reflects the chirality of the dynamic system. Special attention has been given to the computational scheme for this chirality measure. Calculations on realistic molecular systems demonstrate reasonable behaviour of the proposed molecular complexity indices.
Abstract:
This study examines the information content of alternative implied volatility measures for the 30 components of the Dow Jones Industrial Average Index from 1996 until 2007. Along with the popular Black-Scholes and "model-free" implied volatility expectations, the recently proposed corridor implied volatility (CIV) measures are explored. For all pair-wise comparisons, it is found that a CIV measure closely related to the model-free implied volatility nearly always delivers the most accurate forecasts for the majority of the firms. This finding remains consistent across different forecast horizons, volatility definitions, loss functions and forecast evaluation settings.
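The corridor idea can be illustrated as a truncation of the model-free implied variance integral, 2e^{rT}/T ∫ Q(K)/K² dK over out-of-the-money option prices Q(K), to a strike corridor [B1, B2]. A rough numerical sketch with synthetic prices; the discretisation and data handling here are simplifying assumptions, not the paper's procedure:

```python
import numpy as np

def corridor_implied_var(strikes, otm_prices, r, T, lo, hi):
    """Corridor implied variance: the model-free integral
    2*e^{rT}/T * integral of Q(K)/K^2 dK, restricted to strikes in [lo, hi]."""
    K = np.asarray(strikes, dtype=float)
    Q = np.asarray(otm_prices, dtype=float)
    m = (K >= lo) & (K <= hi)          # keep only strikes inside the corridor
    Kc, Qc = K[m], Q[m]
    integrand = Qc / Kc ** 2
    # trapezoidal rule over the (possibly uneven) strike grid
    integral = np.sum(np.diff(Kc) * 0.5 * (integrand[1:] + integrand[:-1]))
    return 2.0 * np.exp(r * T) / T * integral
```

Taking the corridor to span all available strikes recovers a discretised model-free implied variance; narrowing it focuses the measure on the most liquid, informative strikes.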
Abstract:
As we settle into a new year, this second issue of Contact Lens and Anterior Eye allows us to reflect on how new research in this field impacts our understanding, but more importantly, how we use this evidence base to enhance our day-to-day practice, to educate the next generation of students and to construct the research studies to deepen our knowledge still further. The end of 2014 saw the publication of the UK government's Research Excellence Framework (REF), which ranks universities in terms of their outputs (which include their papers, publications and research income), environment (infrastructure and staff support) and, for the first time, impact (defined as “any effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia” [8]). The REF is a process of expert review, carried out in 36 subject-based units of assessment, of which our field is typically submitted to the Allied Health, Dentistry, Nursing and Pharmacy panel. Universities that offer Optometry did very well, with Cardiff, Manchester and Aston in the top 10% of the 94 universities that submitted to this panel (grade point average ranked order). While the format of the next exercise (probably in 2020) to allocate the more than £2 billion of UK government research funds is yet to be determined, it is already rumoured that impact will contribute an even larger proportion to the weighting. Hence it is even more important to reflect on the impact of our research. In this issue, Elisseef and colleagues [5] examine the intriguing potential of modifying a lens surface to allow it to bind to known wetting agents (in this case hyaluronic acid) to enhance water retention. Such a technique has the capacity to reduce friction between the lens surface and the eyelids/ocular surface, presumably leading to greater comfort and fewer reasons for patients to discontinue lens wear.
Several papers in this issue report on the validity of new high-precision, fast-scanning imaging and quantification equipment, utilising techniques such as Scheimpflug imaging, partial coherence interferometry, aberrometry and video, allowing detailed assessment of anterior chamber biometry, corneal topography, corneal biomechanics, peripheral refraction, ocular aberrations and lens fit. The challenge is how to use this advanced instrumentation, which is becoming increasingly available, to create real impact. Many challenges in contact lenses and the anterior eye still prevail in 2015, such as:
- While contact lens and refractive surgery complications are relatively rare, they are still too often devastating to the individual and their quality of life (such as the impact and prognosis of patients with Acanthamoeba keratitis reported by Jhanji and colleagues in this issue [7]). How can we detect those patients who are going to be affected, and what modifications do we need to make to contact lenses and patient management to prevent this occurring?
- Drop-out from contact lenses still occurs at a rapid rate, and symptoms of dry eye seem to be the leading cause driving this discontinuation of wear [1] and [2]. What design, coating, material and lubricant release mechanism will make a step change in end-of-day comfort in particular?
- Presbyopia is a major challenge to hassle-free quality vision and is one of the first signs of ageing noticed by many people. As an emmetrope approaching presbyopia, I have a vested interest in new medical devices that will give me high-quality vision at all distances when my arms won't stretch any further. Perhaps a new definition of presbyopia could be when you start to orientate your smartphone in the landscape direction to gain the small increase in print size needed to read!
Effective accommodating intraocular lenses that truly mimic the pre-presbyopic crystalline lens are still a way off [3], and hence simultaneous images achieved through contact lenses, intraocular lenses or refractive surgery still have a secure future. However, splitting light reaching the retina and requiring the brain to suppress blurred images will always be a compromise on contrast sensitivity and is liable to cause dysphotopsia; so how will new designs account for differences in a patient's task demands and own optical aberrations to allow focused patient selection, optimising satisfaction?
- Drug delivery from contact lenses offers much in terms of compliance and quality of life for patients with chronic ocular conditions such as glaucoma, dry eye and perhaps, in the future, dry age-related macular degeneration; but scientific proof-of-concept publications (see ElShaer et al. [6]) have not yet led to commercial products. Part of this is presumably the regulatory complexity of combining a medical device (the contact lens) and a pharmaceutical agent. Will 2015 be the year when this innovation finally becomes a reality for patients, bringing them an enhanced quality of life through their eye care practitioners and allowing researchers to further validate the use of pharmaceutical contact lenses and propose enhancements as the technology matures?
- Last, but by no means least, is the field of myopia control, the topic of the first day of the BCLA's Conference in Liverpool, June 6–9th 2015. The epidemic of myopia is a blight, particularly in Asia, with significant concerns over sight-threatening pathology resulting from the elongated eye.
This is a field where real impact is already being realised through new soft contact lens optics, orthokeratology and low-dose pharmaceuticals [4], but we still need to be able to better predict which technique will work best for an individual and to develop new techniques to retard myopia progression in those who don't respond to current treatments, without increasing their risk of complications or the treatment impacting their quality of life. So what will your New Year's resolution be to make 2015 a year of real impact, whether by advancing science or applying the findings published in journals such as Contact Lens and Anterior Eye to make a real difference to your patients' lives?
Abstract:
In this paper we present FLQ, a quadratic-complexity bound on the values of the positive roots of polynomials. This bound is an extension of FirstLambda, the corresponding linear-complexity bound, and, consequently, it is derived from Theorem 3 below. We have implemented FLQ in the Vincent-Akritas-Strzeboński Continued Fractions method (VAS-CF) for the isolation of real roots of polynomials and compared its behavior with that of the theoretically proven best bound, LMQ. Experimental results indicate that whereas FLQ runs on average faster (often considerably faster) than LMQ, the quality of the bounds computed by both is about the same; moreover, when VAS-CF is run on our benchmark polynomials using FLQ, LMQ and min(FLQ, LMQ), all three versions run equally well and, hence, it is inconclusive which one should be used in the VAS-CF method.
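The definitions of FLQ and LMQ are given in the paper itself; for orientation, a classical and much cruder bound on the magnitude of all roots (and hence on all positive roots) is Cauchy's bound, sketched below. Any valid upper bound of this kind can play the same role in a root-isolation method such as VAS-CF, with a looser bound simply costing more subdivision steps:

```python
def cauchy_root_bound(coeffs):
    """Cauchy's bound on the absolute value of every root of the
    polynomial with coefficients [a_n, a_{n-1}, ..., a_0] (a_n != 0):
    1 + max_i |a_i| / |a_n|.  FLQ and LMQ are sharper bounds for the
    positive roots specifically; this is a classical baseline."""
    lead, *rest = coeffs
    return 1.0 + max((abs(c) for c in rest), default=0.0) / abs(lead)
```

For example, x² - 3x + 2 has positive roots 1 and 2, and Cauchy's bound gives 1 + 3 = 4, a valid but loose upper limit.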
Abstract:
Abstract (provisional): Background Failing a high-stakes assessment at medical school is a major event for those who go through the experience. Students who fail at medical school may be more likely to struggle in professional practice, so helping individuals overcome problems and respond appropriately is important. There is little understanding of what factors influence how individuals experience failure or make sense of the failing experience in remediation. The aim of this study was to investigate the complexity surrounding the failure experience from the student's perspective using interpretative phenomenological analysis (IPA). Methods The accounts of 3 medical students who had failed final re-sit exams were subjected to in-depth analysis using IPA methodology. IPA was used to analyse each transcript case-by-case, allowing the researcher to make sense of the participant's subjective world. The analysis process allowed the complexity surrounding the failure to be highlighted, alongside a narrative describing how students made sense of the experience. Results The circumstances surrounding students as they approached assessment and experienced failure at finals were a complex interaction between academic problems, personal problems (specifically finance and relationships), strained relationships with friends, family or faculty, and various mental health problems. Each student experienced multi-dimensional issues, each with their own individual combination of problems, but experienced remediation as a one-dimensional intervention focused only on improving performance in written exams. What these students needed included help with clinical skills, plus social and emotional support. Fear of termination of their course was a barrier to open communication with staff. Conclusions These students' experience of failure was complex. The experience of remediation is influenced by the way in which students make sense of failing.
Generic remediation programmes may fail to meet the needs of students for whom personal, social and mental health issues are a part of the picture.
Abstract:
Online enquiry communities such as Question Answering (Q&A) websites allow people to seek answers to all kinds of questions. With the growing popularity of such platforms, it is important for community managers to constantly monitor the performance of their communities. Although different metrics have been proposed for tracking the evolution of such communities, maturity, the process by which communities become more topic-proficient over time, has been largely ignored despite its potential to help in identifying robust communities. In this paper, we interpret community maturity as the proportion of complex questions in a community at a given time. We use the Server Fault (SF) community, a Q&A community of system administrators, as our case study and perform analysis on question complexity, the level of expertise required to answer a question. We show that question complexity depends on both the length of involvement and the level of contributions of the users who post questions within their community. We extract features relating to askers, answerers, questions and answers, and analyse which features are strongly correlated with question complexity. Although our findings highlight the difficulty of automatically identifying question complexity, we found that complexity is most influenced by both the topical focus and the length of community involvement of askers. Following the identification of question complexity, we define a measure of maturity and analyse the evolution of different topical communities. Our results show that different topical communities exhibit different maturity patterns. Some communities show high maturity from the beginning, while others exhibit a slow maturity rate. Copyright 2013 ACM.
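The paper's maturity measure, the proportion of complex questions in a community at a given time, can be sketched as below. The complexity predicate is a stand-in passed by the caller, since the abstract itself notes that automatically identifying question complexity is difficult:

```python
from collections import defaultdict

def maturity_by_period(questions, is_complex):
    """Maturity per time period = share of that period's questions judged
    complex.  `questions` is an iterable of dicts with a 'period' key;
    `is_complex` is any predicate on a question record."""
    totals = defaultdict(int)
    complex_counts = defaultdict(int)
    for q in questions:
        totals[q["period"]] += 1
        if is_complex(q):
            complex_counts[q["period"]] += 1
    return {p: complex_counts[p] / totals[p] for p in sorted(totals)}
```

Plotting this series per topical community is what reveals the differing maturity patterns the paper reports (high from the start versus slowly rising).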
Abstract:
A modern electronic nonlinearity equalizer (NLE) based on an inverse Volterra series transfer function (IVSTF) with reduced complexity is applied to coherent optical orthogonal frequency-division multiplexing (CO-OFDM) signals for next-generation long- and ultra-long-haul applications. The OFDM inter-subcarrier crosstalk effects are explored thoroughly using the IVSTF-NLE and compared with the case of linear equalization (LE) for transmission distances of up to 7000 km. © 2013 IEEE.