928 results for 3-LEVEL SYSTEMS


Relevance: 30.00%

Abstract:

The safety risk management process describes the systematic application of management policies, procedures and practices to the activities of communicating, consulting, establishing the context, and identifying, analysing, evaluating, treating, monitoring and reviewing risk. This process is undertaken to provide assurance that the risks of a particular unmanned aircraft system activity have been managed to an acceptable level. The safety risk management process and its outcomes form part of the documented safety case necessary to obtain approval for unmanned aircraft system operations. It also guides the development of an organisation's operations manual and is a primary component of an organisation's safety management system. The aim of this chapter is to provide existing risk practitioners with a high-level introduction to some of the unique issues and challenges in applying the safety risk management process to unmanned aircraft systems. The scope is limited to safety risks associated with the operation of unmanned aircraft in the civil airspace system and over inhabited areas. The structure of the chapter is based on the safety risk management process as defined by the international risk management standard ISO 31000:2009, and draws on aviation safety resources provided by the International Civil Aviation Organization, the Federal Aviation Administration and the U.S. Department of Defense. References to relevant aviation safety regulations, programs of research and fielded systems are also provided.

Relevance: 30.00%

Abstract:

We address the problem of constructing randomized online algorithms for the Metrical Task Systems (MTS) problem on a metric δ against an oblivious adversary. Restricting our attention to the class of "work-based" algorithms, we provide a framework for designing algorithms that uses the technique of regularization. For the case when δ is a uniform metric, we exhibit two algorithms that arise from this framework, and we prove a bound on the competitive ratio of each. We show that the second of these algorithms is ln n + O(log log n) competitive, which is the current state of the art for the uniform MTS problem.
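As a rough illustration of the regularization technique the abstract names (not the authors' exact algorithms, which are not reproduced here), the sketch below runs an entropically regularized, multiplicative-weights-style strategy for MTS on a uniform metric. The update rule, the learning rate `eta`, and the cost accounting are all assumptions made for the sketch.

```python
import math

def mts_uniform_mw(cost_vectors, eta=1.0):
    """Illustrative entropic-regularization strategy for MTS on a uniform
    metric: keep a distribution over the n states proportional to
    exp(-eta * cumulative cost), then account the expected service cost
    and the movement cost (total-variation distance between successive
    distributions). A sketch of the framework's flavour only."""
    n = len(cost_vectors[0])
    cum = [0.0] * n            # cumulative cost incurred in each state
    prev = [1.0 / n] * n       # previous distribution over states
    service = movement = 0.0
    for c in cost_vectors:
        cum = [x + ci for x, ci in zip(cum, c)]
        z = sum(math.exp(-eta * x) for x in cum)
        p = [math.exp(-eta * x) / z for x in cum]
        # expected service cost under the updated distribution
        service += sum(pi * ci for pi, ci in zip(p, c))
        # on a uniform metric, moving probability mass costs TV distance
        movement += 0.5 * sum(abs(a - b) for a, b in zip(p, prev))
        prev = p
    return service, movement
```

On a uniform metric the cost of moving between two distributions is their total-variation distance, which is why the sketch charges half the L1 difference per step.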

Relevance: 30.00%

Abstract:

The health system is one sector dealing with a deluge of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, and many still run stand-alone systems that are not integrated for information management and decision-making. There is therefore a need for an effective system to capture, collate and distribute this health data, and implementing the data warehouse concept in healthcare is potentially one solution for integrating health data. Data warehousing has been used to support business intelligence and decision-making in many other sectors, such as engineering, defence and retail. The research problem addressed is: "How can data warehousing assist the decision-making process in healthcare?" To address this problem, the investigation focuses on a cardiac surgery unit, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. The cardiac surgery unit at TPCH uses a stand-alone database of patient clinical data, which supports clinical audit, service management and research functions. However, much of the time, interaction between the cardiac surgery unit information system and other units is minimal; there is only limited and basic two-way interaction with the other clinical and administrative databases at TPCH that support decision-making processes. The aims of this research are to investigate what decision-making issues are faced by healthcare professionals with the current information systems, and how decision-making might be improved within this healthcare setting by implementing an aligned data warehouse model or models.
As part of the research, the researcher proposes and develops a suitable data warehouse prototype based on the cardiac surgery unit's needs, integrating the Intensive Care Unit database, the Clinical Costing unit database (Transition II) and the Quality and Safety unit database [electronic discharge summary (e-DS)]. The goal is to improve the current decision-making processes. The main objectives are to improve access to integrated clinical and financial data, potentially providing better information for decision-making. Informed by the questionnaire responses and the literature, the results indicate a centralised data warehouse model for the cardiac surgery unit at this stage. A centralised data warehouse model addresses current needs and can also be upgraded to an enterprise-wide warehouse model or a federated data warehouse model, as discussed in the many consulted publications. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise Edition 4.3. In the final stage, the prototype was evaluated by collecting feedback from end users, using output created from the prototype as examples of the data desired and possible in a data warehouse environment. According to this feedback, a data warehouse was seen as a useful tool to inform management options, provide a more complete representation of factors related to a decision scenario, and potentially reduce information product development time. However, many constraints applied to this research.
These included technical issues (such as data incompatibilities and integration of the cardiac surgery database and e-DS database servers), Queensland Health information restrictions (information-related policies, patient data confidentiality and ethics requirements), limited availability of support from IT technical staff, and time restrictions. These factors influenced the warehouse model development process, necessitating an incremental approach, and highlight the many practical barriers to data warehousing and integration at the clinical service level. Limitations included the use of a small convenience sample of survey respondents and a single-site case report study design. As mentioned previously, the proposed data warehouse is a prototype and was developed using only four database repositories. Despite this constraint, the research demonstrates that by implementing a data warehouse at the service level, decision-making is supported and data quality issues related to access and availability can be reduced, providing many benefits. Output reports produced from the prototype demonstrated usefulness for improving decision-making in the management of clinical services, and for quality and safety monitoring for better clinical care. In the future, the centralised model selected can be upgraded to an enterprise-wide architecture by integrating additional hospital units' databases.

Relevance: 30.00%

Abstract:

There is a debate in the research literature over whether to view police misconduct and crime as acts of individuals perceived as 'rotten apples' or as an indication of systems failure in the police force. Based on an archival analysis of court cases in which police employees were prosecuted, this paper attempts to explore the extent of rotten apples versus systems failure in the police. Exploratory research on 57 prosecuted police officers in Norway indicates that there were more rotten-apple cases than system-failure cases. Individual failure seems to be the norm rather than the exception for ethical breaches, supporting the rotten apple theory. However, as this is exploratory research, police crime may still be explained at the organizational level as well.

Relevance: 30.00%

Abstract:

Flow regime transition criteria are of practical importance for two-phase flow analyses at reduced-gravity conditions. Here, flow regime transition criteria that take the friction pressure loss effect into account were studied in detail. Criteria for reduced-gravity conditions were developed by extending an existing model, and comparison with various experimental datasets taken at microgravity conditions showed satisfactory agreement. Sample computations of the model were performed at various gravity conditions, namely 0.196, 1.62, 3.71 and 9.81 m/s², corresponding to micro-gravity and lunar, Martian and Earth surface gravity, respectively. It was found that the effect of gravity on bubbly-slug and slug-annular (churn) transitions in a two-phase flow system was more pronounced at low liquid flow conditions, whereas the gravity effect could be ignored at high mixture volumetric flux conditions. For the annular flow transitions due to flow reversal and the onset of droplet entrainment, higher superficial gas velocity was obtained at higher gravity levels.

Relevance: 30.00%

Abstract:

In the long term, with development of skill, knowledge, exposure and confidence within the engineering profession, rigorous analysis techniques have the potential to become a reliable and far more comprehensive method for design and verification of the structural adequacy of OPS, write Nimal J Perera, David P Thambiratnam and Brian Clark. This paper explores the potential to enhance operator safety of self-propelled mechanical plant subjected to roll-over and impact of falling objects, using the non-linear and dynamic response simulation capabilities of analytical processes to supplement the quasi-static testing methods prescribed in International and Australian Codes of Practice for bolt-on Operator Protection Systems (OPS) that are post-fitted. The paper is based on research work carried out by the authors at the Queensland University of Technology (QUT) over a period of three years, through instrumentation of prototype tests, scale model tests in the laboratory, and rigorous analysis using validated Finite Element (FE) models. The FE codes used were ABAQUS for implicit analysis and LS-DYNA for explicit analysis. The rigorous analysis and dynamic simulation technique described in the paper can be used to investigate the structural response due to accident scenarios such as multiple roll-overs, impact of multiple objects, and combinations of such events, and thereby enhance the safety and performance of Roll Over and Falling Object Protection Systems (ROPS and FOPS). The analytical techniques are based on sound engineering principles and well-established practice for investigation of dynamic impact on all self-propelled vehicles, and are used for many other similar applications where experimental techniques are not feasible.

Relevance: 30.00%

Abstract:

Pedestrians' use of MP3 players or mobile phones can pose the risk of their being hit by motor vehicles. We present an approach to detecting a crash risk level using the computing power and the microphone of mobile devices, which can be used to alert the user in advance of an approaching vehicle so as to avoid a crash. A single feature extractor and classifier is not usually able to deal with the diversity of risky acoustic scenarios. In this paper, we address the problem of detecting vehicles approaching a pedestrian with a novel, simple, non-resource-intensive acoustic method. The method uses a set of existing statistical tools to mine signal features. Audio features are adaptively thresholded for relevance and classified with a three-component heuristic. The resulting Acoustic Hazard Detection (AHD) system has a very low false positive detection rate. The results of this study could help mobile device manufacturers embed the presented features into future portable devices and contribute to road safety.
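The abstract does not list the statistical features or the exact heuristic, so the sketch below is a hypothetical stand-in: three common per-frame audio features (RMS energy, zero-crossing rate, peak amplitude) are adaptively thresholded against a running baseline of quiet frames, and a frame is flagged only when all three agree, mimicking a low-false-positive, three-component vote.

```python
import math

def frame_features(frame):
    """Three simple per-frame audio features (illustrative choices; the
    paper's actual feature set is not specified in the abstract)."""
    n = len(frame)
    rms = math.sqrt(sum(x * x for x in frame) / n)
    zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / (n - 1)
    peak = max(abs(x) for x in frame)
    return (rms, zcr, peak)

def detect_hazard(frames, history=20, k=3.0):
    """Adaptively threshold each feature against a running baseline and
    flag a frame only when all three features agree -- a stand-in for the
    paper's 'three-component heuristic', biased toward few false alarms."""
    baseline = []  # recent feature tuples from presumed-quiet frames
    alerts = []
    for frame in frames:
        f = frame_features(frame)
        if len(baseline) < history:       # still collecting the baseline
            baseline.append(f)
            alerts.append(False)
            continue
        flags = []
        for j in range(3):
            vals = [b[j] for b in baseline]
            mu = sum(vals) / len(vals)
            sd = math.sqrt(sum((v - mu) ** 2 for v in vals) / len(vals))
            # small absolute floor avoids triggering on float jitter
            flags.append(f[j] > mu + k * sd + 1e-6)
        alerts.append(all(flags))          # all three components must vote
        if not alerts[-1]:                 # keep baseline free of hazards
            baseline.pop(0)
            baseline.append(f)
    return alerts
```

Requiring agreement of all three components is one simple way to keep the false positive rate low, at the price of some sensitivity.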

Relevance: 30.00%

Abstract:

Although in the late 1990s there was much discussion as to whether the idea of information literacy was necessary or had longevity, global interest in the phenomenon has increased rather than diminished. In the midst of all this activity, what has happened to the way in which we interpret the idea of information literacy over the last decade or more? The label of information literacy has certainly become widely applied, especially to library-based programs, and remains more popular in formal learning environments.

Ultimately, information literacy is about people's experience of using information wherever they happen to be. Information literacy is about people interacting, engaging and working with information in many contexts, either individually or in community. Emerging technologies may transform the kinds of information available and how it is engaged with. Nevertheless, we continue to need to understand the experience of information use in order to support people in their information environments. We continue to need to develop programs which reflect and enhance people's experiences of using information to learn in ever-widening and more complex settings (Bruce, 2008; Bruce & Hughes, 2010).

Relevance: 30.00%

Abstract:

Is it possible to control identities using performance management systems (PMSs)? This paper explores the theoretical fusion of management accounting and identity studies, providing a synthesised view of control, PMSs and identification processes. It argues that the effective use of PMSs generates a range of obtrusive mechanistic and unobtrusive organic controls that mediate identification processes to achieve a high level of identity congruency between individuals and collectives—groups and organisations. This paper contends that mechanistic control of PMSs provides sensebreaking effects and also creates structural conditions for sensegiving in top-down identification processes. These processes encourage individuals to continue the bottom-up processes of sensemaking, enacting identity and constructing identity narratives. Over time, PMS activities and conversations periodically mediate several episode(s) of identification to connect past, current and future identities. To explore this relationship, the dual locus of control—collectives and individuals—is emphasised to explicate their interplay. This multidisciplinary approach contributes to explaining the multidirectional effects of PMSs in obtrusive as well as unobtrusive ways, in order to control the nature of collectives and individuals in organisations.

Relevance: 30.00%

Abstract:

This paper presents a method for automatic terrain classification, using a cheap monocular camera in conjunction with a robot’s stall sensor. A first step is to have the robot generate a training set of labelled images. Several techniques are then evaluated for preprocessing the images, reducing their dimensionality, and building a classifier. Finally, the classifier is implemented and used online by an indoor robot. Results are presented, demonstrating an increased level of autonomy.
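A minimal sketch of this kind of self-supervised pipeline, with assumed details throughout: block-mean grey values stand in for the preprocessing and dimensionality-reduction stages, and a nearest-centroid classifier stands in for whichever classifier was evaluated; the labels ("traversable"/"obstacle") would come from the robot's stall sensor during the training run.

```python
def image_features(pixels, grid=4):
    """Reduce an image (2-D list of grey values) to a small vector of
    block means -- a crude stand-in for the preprocessing and
    dimensionality-reduction techniques evaluated in the paper."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // grid, w // grid
    feats = []
    for r in range(grid):
        for c in range(grid):
            block = [pixels[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            feats.append(sum(block) / len(block))
    return feats

class NearestCentroid:
    """Minimal classifier trained on self-labelled images: frames gathered
    while driving are labelled by the stall sensor (illustrative; the
    paper compares several classifiers, not necessarily this one)."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            # per-dimension mean of this label's feature vectors
            self.centroids[label] = [sum(col) / len(rows)
                                     for col in zip(*rows)]
        return self

    def predict(self, x):
        def d2(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lab: d2(self.centroids[lab]))
```

Once trained, the classifier can run online on each incoming frame, which is what lets the robot avoid terrain it would otherwise have to stall on to discover.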

Relevance: 30.00%

Abstract:

This PhD study examines whether water allocation becomes more productive when it is re-allocated from 'low'- to 'high'-efficiency alternative uses in village irrigation systems (VISs) in Sri Lanka. Reservoir-based agriculture is a collective farming economic activity in which inter-sectoral allocation of water is assumed to be inefficient due to market imperfections and weak user rights. Furthermore, the available literature shows that a 'head-tail syndrome' is the most common issue for intra-sectoral water management in 'irrigation' agriculture. This research analyses the issue of water allocation using primary data collected from two surveys of 460 rice farmers and 325 fish farming groups in two administrative districts in Sri Lanka. Technical efficiency estimates are undertaken for both rice farming and culture-based fisheries (CBF) production. The equi-marginal principle is applied to inter- and intra-sectoral allocation of water. Welfare benefits of water re-allocation are measured through consumer surplus estimation. Based on these analyses, the overall findings of the thesis can be summarised as follows. The estimated mean technical efficiency (MTE) for rice farming is 73%; for CBF production, the estimated MTE is 33%. The technical efficiency distribution is skewed to the left for rice farming, while it is skewed to the right for CBF production. The results show that technical efficiency of rice farming can be improved by formalising transferability of land ownership and, therefore, water user rights, by enhancing the institutional capacity of Farmer Organisations (FOs). Other effective tools for improving technical efficiency of CBF production are strengthening group stability of CBF farmers, improving the accessibility of official consultation, and attracting independent investments. Inter-sectoral optimal allocation shows that the estimated inefficient volume of water in rice farming, which can be re-allocated to CBF production, is 32%.
With the application of successive policy instruments (e.g., a community transferable quota system and promotion of CBF activities), there is potential for a threefold increase in the marginal value product (MVP) of total reservoir water in VISs. The existing intra-sectoral inefficient volume of water use in tail-end and head-end fields can potentially be removed by reducing water use by 10% and 23% respectively and re-allocating this to middle fields. This re-allocation may enable a twofold increase in the MVP of water used in rice farming without reducing the existing rice output, but will require developing irrigation practices to facilitate the re-allocation. Finally, the total productivity of reservoir water can be increased by responsible village-level institutions and primary-level stakeholders sharing responsibility for water management (i.e., co-management), while allowing market forces to guide efficient re-allocation decisions. This PhD has demonstrated that instead of allocating water between uses haphazardly, farmers can now base their decisions on efficient water use with a view to increasing water productivity. Such an approach will no doubt enhance farmer incomes and community welfare.
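The equi-marginal principle the thesis applies can be sketched numerically: a fixed water volume is split between two uses so that their marginal value products (MVPs) are equalized. The linear MVP curves and volumes below are hypothetical illustrations, not the thesis's estimated Sri Lankan values.

```python
def equi_marginal_split(total, mvp_a, mvp_b, tol=1e-6):
    """Split a fixed water volume between two uses so their marginal
    value products are (approximately) equal -- the equi-marginal
    principle. Assumes both MVP curves are decreasing in own use
    (diminishing marginal returns), so bisection converges."""
    lo, hi = 0.0, total
    while hi - lo > tol:
        a = (lo + hi) / 2
        # if use A's marginal value still exceeds use B's, give A more
        if mvp_a(a) > mvp_b(total - a):
            lo = a
        else:
            hi = a
    return (lo + hi) / 2

# Hypothetical diminishing-returns MVP curves for two uses, e.g. rice
# farming and culture-based fisheries:
rice_mvp = lambda w: 10 - 0.5 * w
fish_mvp = lambda w: 8 - 0.3 * w
```

With these curves and 20 units of water, equalizing 10 − 0.5a = 8 − 0.3(20 − a) gives a = 10 units to each use; any other split leaves value on the table.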

Relevance: 30.00%

Abstract:

The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Funding levels are not adequate to meet the preservation needs of the roadway network; therefore, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which projects should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to prioritize projects. These ranking methods have relied on less formal qualitative assessments based on engineers' subjective judgment. It is important for TxDOT to have a 4-Year Pavement Management Plan that uses a transparent, rational project ranking process. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan. It can be broadly divided into three steps: 1) a network-level project screening process, 2) a project-level project ranking process, and 3) economic analysis. A rational pavement management procedure and a project ranking method accepted by districts and the TxDOT administration will maximize efficiency in budget allocations and will potentially help improve pavement condition. As part of the implementation of the 4-Year Pavement Management Plan, the Network-Level Project Screening (NLPS) tool, including the candidate project identification algorithm and the preliminary project ranking matrix, was developed. The NLPS has been used by the Austin District Pavement Engineer (DPE) to evaluate PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation.
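A toy sketch of the three-step structure described above; the field names, condition cutoff, weights and benefit/cost criterion are all hypothetical placeholders, not TxDOT's actual PMIS rules.

```python
def rank_projects(projects, condition_cutoff=70, weights=None):
    """Sketch of the plan's three steps: (1) network-level screening by a
    condition-score cutoff, (2) project-level ranking by a weighted need
    score, (3) a crude need-per-dollar economic ordering. All criteria
    here are hypothetical stand-ins."""
    weights = weights or {"condition": 0.5, "traffic": 0.3, "age": 0.2}

    # Step 1: screen -- only sections below the condition cutoff qualify
    candidates = [p for p in projects if p["condition"] < condition_cutoff]

    # Step 2: weighted need score (worse condition => higher need)
    def need(p):
        return (weights["condition"] * (100 - p["condition"])
                + weights["traffic"] * p["traffic"]
                + weights["age"] * p["age"])

    # Step 3: order by need per dollar as a simple economic criterion
    return sorted(candidates, key=lambda p: need(p) / p["cost"],
                  reverse=True)
```

In a real plan the economic analysis step would typically use life-cycle cost or benefit/cost ratios rather than this simple need-per-dollar ordering.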

Relevance: 30.00%

Abstract:

Organizations today engage in various forms of alliances to manage their existing business processes or to diversify into new processes to sustain their competitive positions. Many of today's alliances use IT resources as their backbone. The results of these alliances are collaborative organizational structures with little or no ownership stakes between the parties. The emergence of Web 2.0 tools is having a profound effect on the nature and form of these alliance structures. These alliances heavily depend on and make radical use of IT resources in a collaborative environment. This situation requires a deeper understanding of the governance of these IT resources to ensure the sustainability of the collaborative organizational structures. This study first suggests the types of IT governance structures required for collaborative organizational structures. Semi-structured interviews with senior executives who operate in such alliances reveal that co-created IT governance structures are necessary. Such structures include co-created IT steering committees, co-created operational committees, and inter-organizational performance management and communication systems. The findings paved the way for the development of a model for understanding approaches to governing IT and for evaluating the effectiveness of such governance mechanisms in today's IT-dependent alliances. This study presents a sustainable IT-related capabilities approach to assessing the effectiveness of the suggested IT governance structures for collaborative alliances. The findings indicate a favourable association between organizations' IT governance efforts and their ability to sustain their capabilities to leverage their IT resources. These IT-related capabilities also relate to measures of business value at the process and firm levels. This makes it possible to infer that collaborative organizations' IT governance efforts contribute to business value.

Relevance: 30.00%

Abstract:

Given the substantial investment in information technology (IT), and the significant impact IT has on organizational success, organizations consume considerable resources to manage the acquisition and use of their IT resources. While various arguments have been proposed as to which IT governance arrangements may work best, our understanding of the effectiveness of such initiatives is limited. We examine the relationship between the effectiveness of IT steering committee-driven IT governance initiatives and a firm's IT management and IT infrastructure related capabilities. We further propose that a firm's IT-related capabilities generated through IT governance initiatives should improve its business processes and firm-level performance. We test these relationships empirically using a field survey. Results suggest that the effectiveness of firms' IT steering committee-driven IT governance initiatives positively relates to the level of their IT-related capabilities. We also found positive relationships between IT-related capabilities and internal process-level performance. Our results also support that improvement in internal process-level performance positively relates to improvement in customer service and firm-level performance.

Relevance: 30.00%

Abstract:

The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now more in demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system with those from a NaturalPoint OptiTrack system. In all experiments, data were sampled simultaneously. We found that both systems perform excellently in linear accuracy tests, with absolute errors not exceeding 1%. In the gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable in research for laboratories with a limited budget.