870 results for Integrated risk management
Abstract:
This paper reviews the literature on construction risk modelling and assessment, as well as the real-world practice of risk assessment. The review yielded significant results, summarised as follows. There has been a major shift in risk perception from an estimation variance to a project attribute. Although the Probability–Impact risk model prevails, substantial efforts are being made to improve it to reflect the increasing complexity of construction projects. The literature lacks a comprehensive assessment approach capable of capturing risk impact on different project objectives. Obtaining a realistic project risk level demands an effective mechanism for aggregating individual risk assessments. The various assessment tools suffer from low take-up; professionals typically rely on their experience. It is concluded that a simple analytical tool that uses risk cost as a common scale and draws on professional experience could be a viable option to help close the gap between the theory and practice of risk assessment.
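The core of the Probability–Impact aggregation with risk cost as a common scale can be sketched in a few lines. This is an illustrative sketch, not the tool the paper proposes; the function name and the example figures are hypothetical.

```python
def expected_risk_cost(risks):
    """Aggregate individual risk assessments into a single expected
    monetary cost, using risk cost as the common scale.

    risks: iterable of (probability, cost_impact) pairs, one per risk,
    where cost_impact is the estimated monetary impact if the risk occurs.
    """
    return sum(p * cost for p, cost in risks)

# Two hypothetical risks: a 30% chance of a 100,000 overrun and a
# 10% chance of a 50,000 overrun aggregate to an expected cost of 35,000.
project_risk_level = expected_risk_cost([(0.30, 100_000), (0.10, 50_000)])
```

Expressing every impact in money is what allows risks affecting different objectives (time, quality, safety) to be summed into one project-level figure.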
Abstract:
Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was used innovatively to tackle the problem of insufficient information by enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk impact on the various project objectives, and the Evidential Reasoning algorithm is suggested as a novel alternative for aggregating individual assessments. A spreadsheet-based decision support system (DSS) was devised to facilitate the proposed approach. Four case studies were conducted to examine the approach's viability: senior managers in four British construction companies tried the DSS and gave very promising feedback. The paper concludes that the proposed methodology may contribute to bridging the gap between the theory and practice of construction risk assessment.
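The idea of combining incomplete assessments can be illustrated with plain Dempster's rule of combination (the paper itself suggests the Evidential Reasoning algorithm, a related but more elaborate scheme). A minimal sketch with hypothetical assessor inputs; mass left on the whole frame {low, high} represents an analyst's acknowledged ignorance:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule of combination.
    Each mass function maps frozenset focal elements to masses summing to 1."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Redistribute the non-conflicting mass so the result sums to 1 again
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two assessors rate a risk's impact; both leave some mass on the
# whole frame, i.e. they give deliberately incomplete assessments.
m1 = {frozenset({"low"}): 0.6, frozenset({"low", "high"}): 0.4}
m2 = {frozenset({"high"}): 0.5, frozenset({"low", "high"}): 0.5}
combined = dempster_combine(m1, m2)
```

Because unassigned belief stays on the frame rather than being forced onto a single outcome, an analyst who lacks information simply commits less mass, which is exactly the behaviour the approach exploits.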
Abstract:
This work examines the organization and content perspectives of an Enterprise Content Management (ECM) framework. The case study at the Federal University of Rio Grande do Norte (UFRN) was based on an ECM model to analyse the information management provided by the three main administrative systems: the Integrated Management of Academic Activities (SIGAA), the Integrated System of Assets, Administration and Contracts (SIPAC), and the Integrated System for Administration and Human Resources (SIGRH). A case study protocol was designed to give greater reliability to the research process. Four propositions were examined in order to reach the specific objectives of identifying and evaluating ECM components from the UFRN perspective. The preliminary phase provided the guidelines for data collection. In total, 75 individuals were interviewed. Interviews with four managers directly involved in systems design were recorded (average duration of 90 minutes). The 70 remaining individuals were approached at random in UFRN's units, and included teachers, administrative-technical employees and students. The results showed the presence of many ECM elements in the management of UFRN administrative information. The technological component with the highest presence was "management of web content / collaboration", but initiatives involving other components (e.g. email and document management) were also found and are continuously improving. The assessment used eQual 4.0 to examine the effectiveness of the applications under three factors: usability, quality of information, and service offered. In general, the quality offered by the systems was very good and went hand in hand with the benefits obtained from adopting an ECM strategy across the whole institution.
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging because of reinforcing feedbacks between multiple drivers. We conducted semistructured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision making for risk management: it helped identify key forcing variables and generated insights into potential risks and trade-offs of different strategies. The “Hands-off” scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production under drought conditions. The “Fire management” scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the “Fire suppression” scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a “boundary object” to facilitate collaboration and the integration of different perceptions of fire in the region.
This approach also has the potential to inform decisions in other dynamic frontier landscapes around the world that are facing increased risk of large wildfires.
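The scenario mechanics of an FCM can be sketched as iterated updates of concept activations through a signed weight matrix, with scenario drivers clamped to fixed values. This is a generic FCM sketch with invented concepts and weights, not the map or calibration from the study:

```python
import numpy as np

def fcm_run(state, W, clamp=None, lam=1.0, steps=100):
    """Iterate a fuzzy cognitive map towards a fixed point.

    state: initial activation of each concept, in [0, 1].
    W[i, j]: signed influence of concept i on concept j.
    clamp: {index: value} of scenario drivers held fixed each step.
    Each step squashes self + weighted incoming activation with a sigmoid.
    """
    state = np.asarray(state, dtype=float)
    clamp = clamp or {}
    for _ in range(steps):
        state = 1.0 / (1.0 + np.exp(-lam * (state + W.T @ state)))
        for i, v in clamp.items():  # hold scenario drivers fixed
            state[i] = v
    return state

# Concepts: 0 = drought, 1 = controlled burning, 2 = wildfire risk.
# Drought raises wildfire risk; controlled burning lowers it (invented weights).
W = np.array([[0.0, 0.0,  0.7],
              [0.0, 0.0, -0.6],
              [0.0, 0.0,  0.0]])

hands_off = fcm_run([1.0, 0.0, 0.5], W, clamp={0: 1.0, 1: 0.0})
fire_mgmt = fcm_run([1.0, 1.0, 0.5], W, clamp={0: 1.0, 1: 1.0})
```

Comparing the converged activation of the risk concept across scenarios is how FCM studies of this kind contrast intervention options without precise quantitative data.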
Abstract:
This work aims to analyze risks related to information technology (IT) in data migration procedures. The analysis considers ALEPH, the Integrated Library System (ILS) whose data were migrated to the Library Module of the software called Sistema Integrado de Gestão de Atividades Acadêmicas (SIGAA) at the Zila Mamede Central Library of the Federal University of Rio Grande do Norte (UFRN) in Natal, Brazil. The methodology was qualitative exploratory research, with a case study conducted at the library in order to better understand this phenomenon. Data were collected through semi-structured interviews with 11 subjects employed at the library and at the Technology Superintendence at UFRN. The data were examined through content analysis and a thematic review process. After the data migration, the interview results were linked to the analysis units and their system registers by category correspondence. The main risks detected were: data destruction; data loss; database communication failure; user response delay; and data inconsistency and duplication. These risks generate disorders that affect external and internal system users, leading to stress, duplicated work and hassles. Some risk management measures were therefore taken, such as adequate planning, central management support, and pilot test simulations. These measures reduced risk, the occurrence of problems and possible unforeseen costs, and helped achieve organizational objectives, among other benefits. It is inferred that the risks present in database conversion in libraries exist and some are predictable; however, librarians are often unaware of these risks, or ignore them and show little concern for identifying them, although acknowledging them would minimize or even eliminate them.
Another important aspect is that little empirical research deals specifically with this subject; new approaches are therefore needed to promote a better understanding of the matter in the corporate environment of information units.
Abstract:
Mobile network coverage is traditionally provided by outdoor macro base stations, which have a long range and serve large numbers of customers. Due to modern passive houses and tightening construction legislation, mobile network service has deteriorated in many indoor locations. Typical solutions to the indoor coverage problem are expensive and demand actions from the mobile operator, so better solutions are constantly being researched. The solution presented in this thesis is based on Small Cell technology. Small Cells are low-power access nodes designed to provide voice and data services. This thesis concentrates on a specific Small Cell solution called a Pico Cell. The problem with Pico Cells, and Small Cells in general, is that they are a new technological solution for the mobile operator, and the possible problem sources and incidents are not properly mapped. The purpose of this thesis is to identify the possible problems in Pico Cell deployment and how they could be solved within the operator's incident management process. The research is carried out through a literature review and a case study, and the possible problems are investigated through lab testing. The Pico Cell automated deployment process was tested in the lab environment and its proper functionality was confirmed. The related network elements were also tested and examined, and the problems that emerged are resolvable. The operator's existing incident management process can be used for Pico Cell troubleshooting with minor updates, although certain pre-requirements have to be met before Pico Cell deployment can be considered. The main contribution of this thesis is the Pico Cell integrated incident management process. The presented solution works in theory and solves the problems found during lab testing. The limitations at the customer service level were addressed by adding the necessary tools and designing a working question pattern.
Process structures for automated network discovery and Pico-specific radio parameter planning were also added to the mobile network management layer.
Abstract:
This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas.
We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite sample properties are studied under four data generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short selling constraints and transaction costs.
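The realized measures underlying these chapters start from a simple object: the sum of outer products of intraday log returns sampled on a common grid. A bare-bones sketch with simulated data (no pre-averaging, jump robustness or noise correction, all of which the estimators above add):

```python
import numpy as np

def realized_covariance(prices):
    """Realized covariance matrix from synchronised intraday prices.

    prices: array of shape (n_obs, n_assets) sampled on a common time grid.
    Returns the (n_assets, n_assets) sum of outer products of log returns;
    the diagonal holds the realized variances.
    """
    r = np.diff(np.log(prices), axis=0)  # intraday log returns
    return r.T @ r

# Simulate one trading day of 5-minute prices for two correlated assets
# (78 five-minute bars in a 6.5-hour session).
rng = np.random.default_rng(0)
z = rng.standard_normal((78, 2))
ret = 0.001 * np.column_stack([z[:, 0], 0.8 * z[:, 0] + 0.6 * z[:, 1]])
log_prices = np.log(100.0) + np.vstack([np.zeros((1, 2)), np.cumsum(ret, axis=0)])
rcov = realized_covariance(np.exp(log_prices))
```

In practice the difficulty, and the subject of these chapters, is that observed prices are contaminated by microstructure noise and arrive asynchronously across assets, so this naive estimator degrades as the sampling frequency rises.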
Abstract:
Hendra virus (HeV) was first described in 1994 in an outbreak of acute and highly lethal disease in horses and humans in Australia. Equine cases continue to be diagnosed periodically, yet the predisposing factors for infection remain unclear. We undertook an analysis of equine submissions tested for HeV by the Queensland government veterinary reference laboratory over a 20-year period to identify and investigate any patterns. We found a marked increase in testing from July 2008, primarily reflecting a broadening of the HeV clinical case definition. Peaks in submissions for testing, and visitations to the Government HeV website, were associated with reported equine incidents. Significantly differing between-year HeV detection rates in north and south Queensland suggest a fundamental difference in risk exposure between the two regions. The statistical association between HeV detection and stockhorse type may suggest that husbandry is a more important risk determinant than breed per se. The detection of HeV in horses with neither neurological nor respiratory signs poses a risk management challenge for attending veterinarians and laboratory staff, reinforcing animal health authority recommendations that appropriate risk management strategies be employed for all sick horses, and by anyone handling sick horses or associated biological samples.
Abstract:
There are enormous benefits for any organisation from practising sound records management. In the context of a public university, the importance of good records management includes: facilitating the achievement of the university’s mandate; enhancing the efficiency of the university; maintaining a reliable institutional memory; promoting trust; responding to an audit culture; enhancing university competitiveness; supporting the university’s fiduciary duty; demonstrating transparency and accountability; and fighting corruption. Records scholars and commentators posit that effective recordkeeping is an essential underpinning of good governance. Despite this positive correlation, recordkeeping struggles to get the same attention as that given to governance. Evidence abounds of cases of neglect of recordkeeping in universities and other institutions in Sub-Saharan Africa. The apparent absence of sound recordkeeping provided a rationale for revisiting some universities in South Africa and Malawi in order to critically explore the place of recordkeeping in an organisation’s strategy and to develop an alternative framework for managing records and documents in an era where good governance is a global agenda. The research is a collective case study in which multiple cases are used to critically explore the relationship between recordkeeping and governance. As qualitative research that belongs in the interpretive tradition of enquiry, it is not meant to suggest prescriptive solutions to general recordkeeping problems but rather to provide an understanding of the challenges and opportunities that arise in managing records and documents in the world of governance, audit and risk. That is: what goes on in the workplace; what are the problems; and what alternative approaches might address any existing problem situations.
Research findings show that some institutions are making good use of their governance structures and other drivers for recordkeeping to put in place sound recordkeeping systems. Key governance structures and other drivers for recordkeeping identified include: laws and regulations; governing bodies; audit; risk; technology; reforms; and workplace culture. Other institutions are not managing their records and documents well despite efforts to improve their governance systems: they lack recordkeeping capacity. Areas that determine recordkeeping capacity include: availability of a records management policy; capacity for digital records; availability of a records management unit; senior management support; level of education and training of records management staff; and systems and procedures for the storage, retrieval and disposition of records. Although this research reveals that the overall recordkeeping in the selected countries has slightly improved compared with the situation other researchers found a decade ago, it remains unsatisfactory and disjointed from governance. The study therefore proposes governance recordkeeping as an approach to managing records and documents in the world of governance, audit and risk. The governance recordkeeping viewpoint considers recordkeeping as a governance function that should be treated in the same manner as other governance functions such as audit and risk management. Additionally, recordkeeping and governance should be considered symbiotic elements of a strategy: a strategy that neglects recordkeeping may not fulfil the organisation’s objectives effectively.
Abstract:
Western flower thrips (WFT), Frankliniella occidentalis (Pergande), is an important pest of vegetable crops worldwide and has developed resistance to many insecticides. The predatory mite Neoseiulus (=Amblyseius) cucumeris (Oudemans), the entomopathogenic fungus Metarhizium anisopliae (Metsch.), and an insecticide (imidacloprid) were tested for their efficacy in reducing WFT population density and damage to French bean (Phaseolus vulgaris L.) pods under field conditions in two planting periods. Metarhizium anisopliae was applied as a foliar spray weekly at a rate of one litre of spray volume per plot, while imidacloprid was applied as a soil drench every two weeks at a rate of two litres of a water–imidacloprid mixture per m². Neoseiulus cucumeris was released every two weeks on plant foliage at a rate of three mites per plant. Single and combined treatment applications reduced WFT population density by at least three times and WFT damage to French bean pods by at least 1.7 times compared with untreated plots. The benefit–cost ratios in WFT management were favourable, with the highest returns realized on imidacloprid-treated plots. The results indicate that M. anisopliae, N. cucumeris, and imidacloprid have potential for use in developing an integrated pest management program against WFT on French beans.
Abstract:
Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2015.
Abstract:
Part 17: Risk Analysis