Abstract:
This study contributes to understanding the role of financial reserves in sustaining nonprofit organisations. Recognising the limited recent Australian research in the area of nonprofit financial vulnerability, it specifically examines financial reserves held by signatories to the Code of Conduct of the Australian Council for International Development (ACFID) for the years 2006 to 2010. As this period includes the Global Financial Crisis, it presents a unique opportunity to observe the role of savings in a period of heightened financial threats to sustainability. The need for nonprofit entities to maintain reserves, while appearing intuitively evident, is neither unanimously accepted nor supported by established theoretical constructs. Some early frameworks attempt to explain the savings behaviour of nonprofit organisations and its role in organisational sustainability. Where researchers have considered the issue, its treatment has usually been either purely descriptive or, alternatively, peripheral to a broader attempt to predict financial vulnerability. Given the importance of nonprofit entities to civil society, the sustainability of these organisations during times of economic contraction, such as the recent Global Financial Crisis, is a significant issue. Widespread failure of nonprofits, or even the perception of failure, would directly affect not only those individuals who access their public goods and services, but would also damage public confidence in both government and the sector's ability to manage and achieve its purpose. This study attempts to 'shine a light' on the paradox inherent in considering nonprofit savings. On the one hand, a prevailing public view is that nonprofit organisations should not hoard and indeed should spend all of their funds on the direct achievement of their purposes.
Against this is the common-sense need for a financial buffer, if only to allow for the day-to-day contingencies of pay rises and cost increases. At the entity level, the extent of reserves accumulated (or not) is an important consideration for Management Boards. The general public are also interested in knowing the level of funds held by nonprofits, as a measure of both their commitment to purpose and as an indicator of their effectiveness. There is a need to communicate the level and prevalence of reserve holdings, balancing the prudent hedging of uncertainty against a sense of resource hoarding in the mind of donors. Finally, funders (especially governments) are interested in knowing the appropriate level of reserves to facilitate the ongoing sustainability of the sector. This is particularly so where organisations are involved in the provision of essential public goods and services. At a scholarly level, the study seeks to provide a rationale for this behaviour within the context of appropriate theory. At a practical level, the study seeks to give an indication of the drivers for savings and the actual levels of reserves held within the sector studied, as well as an indication as to whether the presence of reserves did mitigate the effects of financial turmoil during the Global Financial Crisis. The argument is not whether there is a need to ensure the sustainability of nonprofits, but rather how it is to be done and whether the holding of reserves (net assets) is an essential element in achieving this. While the study offers no simple answers, it does appear that the organisations studied present as two groups: the 'savers' who build reserves and keep 'money in the bank', and the 'spender-deliverers' who put their resources 'on the ground'. To progress an understanding of this dichotomy, the study suggests moving from its current approach to one that more closely explores accounts-based empirical evidence on donor attitudes and nonprofit Management Board strategy.
Abstract:
Detailed representations of complex flow datasets are often difficult to generate using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows. We review two popular texture-based techniques and their application to flow datasets sourced from active research projects. The techniques investigated were line integral convolution (LIC) [1] and image-based flow visualisation (IBFV) [18]. We evaluated these and report on their effectiveness from a visualisation perspective. We also report on their ease of implementation and computational overheads.
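The advection idea behind these techniques can be illustrated with a minimal line integral convolution sketch in Python. This is an unoptimised illustration with fixed-length streamlines and nearest-neighbour sampling, not the implementation evaluated in the paper; production LIC uses higher-order streamline integration and normalised convolution kernels.

```python
import numpy as np

def lic(vx, vy, noise, length=6):
    """Crude line integral convolution: for each pixel, trace a short
    streamline forward and backward through the vector field and average
    the noise texture along it, producing streaks aligned with the flow."""
    h, w = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for i in range(h):
        for j in range(w):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):          # forward and backward passes
                x, y = float(j), float(i)     # x = column, y = row
                for _ in range(length):
                    r, c = int(round(y)), int(round(x))
                    if not (0 <= r < h and 0 <= c < w):
                        break
                    total += noise[r, c]
                    count += 1
                    u, v = vx[r, c], vy[r, c]
                    norm = np.hypot(u, v)
                    if norm < 1e-9:           # stagnation point: stop tracing
                        break
                    x += sign * u / norm      # unit-speed advection step
                    y += sign * v / norm
            out[i, j] = total / max(count, 1)
    return out
```

Because neighbouring pixels along a streamline share most of their samples, the output is strongly correlated in the flow direction, which is exactly the visual cue these techniques exploit.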
Abstract:
Universities often struggle to satisfy students' need for feedback. This is an area where student satisfaction with courses of study can be low. Yet it is clear that one of the properties of good teaching is giving the highest quality feedback on student work. The term 'feedback', though, is most commonly associated with summative assessment given by a teacher after work is completed. The student can often be a passive participant in the process. This paper looks at the implementation of a web-based interactive scenario completed by students prior to summative assessment. It requires students to participate actively to develop and improve their legal problem solving skills. Traditional delivery of legal education focuses on print and an instructor who conveys the meaning of the written word to students. Today, mixed modes of teaching are often preferred, and they can provide enhanced opportunities for feeding forward with greater emphasis on what students do. Web-based activities allow for flexible delivery; they are accessible off campus, at a time that suits the student, and may be completed by students at their own pace. This paper reports on an online interactive activity which provides the valuable formative feedback necessary for successful completion of a final problem solving assignment. It focuses on how the online activity feeds forward and contributes to the development of legal problem solving skills. Introduction to Law is a unit designed and introduced for completion by undergraduate students from faculties other than law, but is focused most particularly on students enrolled in the Bachelor of Entertainment Industries degree, a joint initiative of the faculties of Creative Industries, Business and Law at the Queensland University of Technology in Australia. The final (and major) assessment for the unit is an assignment requiring students to explain the legal consequences of particular scenarios.
A number of cost-effective web-based interactive scenarios have been developed to support the unit's classroom activities. The tool commences with instruction on problem solving method. Students then view the stimulus, which is a narrative produced in the form of a music video clip. A series of questions is posed to guide students through the process, and they can compare their responses with sample answers provided. The activity clarifies the problem solving method and expectations for the summative assessment and allows students to practise the skill. The paper reports on the approach to teaching and learning taken in the unit, including the design process and implementation of the activity. It includes an evaluation of the activity with respect to its effectiveness as a tool to feed forward, and reflects on the implications for the teaching of law in higher education.
The backfilled GEI: a cross-capture modality gait feature for frontal and side-view gait recognition
Abstract:
In this paper, we propose a novel direction for gait recognition research by proposing a new capture-modality-independent, appearance-based feature which we call the Back-filled Gait Energy Image (BGEI). It can be constructed from both frontal depth images and the more commonly used side-view silhouettes, allowing the feature to be applied across these two differing capturing systems using the same enrolled database. To evaluate this new feature, a frontally captured depth-based gait dataset was created containing 37 unique subjects, a subset of which also contained sequences captured from the side. The results demonstrate that the BGEI can effectively be used to identify subjects through their gait across these two differing input devices, achieving a rank-1 match rate of 100% in our experiments. We also compare the BGEI against the GEI and GEV in their respective domains, using the CASIA dataset and our depth dataset, showing that it compares favourably against them. The experiments were performed using a sparse-representation-based classifier with a locally discriminating input feature space, which showed a significant improvement in performance over other classifiers used in the gait recognition literature, achieving state-of-the-art results with the GEI on the CASIA dataset.
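The back-filling of frontal depth silhouettes is specific to the BGEI and is not shown here, but the Gait Energy Image that underlies both modalities is simply a temporal average of aligned binary silhouettes. A minimal sketch:

```python
import numpy as np

def gait_energy_image(silhouettes):
    """Average a sequence of size-normalised, centre-aligned binary
    silhouettes of shape (T, H, W) into one grey-level Gait Energy Image.
    Pixels near 1 are part of the body in every frame (static regions);
    intermediate values capture the motion of the limbs over the cycle."""
    return np.asarray(silhouettes, dtype=float).mean(axis=0)
```

In practice the silhouettes must first be segmented, scaled to a common height and horizontally centred before averaging; that preprocessing is assumed here.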
Abstract:
The feasibility of using an in-hardware implementation of a genetic algorithm (GA) to solve the computationally expensive travelling salesman problem (TSP) is explored, especially in regard to hardware resource requirements for problem and population sizes. We investigate via numerical experiments whether a small population size might prove sufficient to obtain reasonable quality solutions for the TSP, thereby permitting relatively resource-efficient hardware implementation on field programmable gate arrays (FPGAs). Software experiments on two TSP benchmarks involving 48 and 532 cities were used to explore the extent to which population size can be reduced without compromising solution quality, and results show that a GA allowed to run for a large number of generations with a smaller population size can yield solutions of comparable quality to those obtained using a larger population. This finding is then used to investigate feasible problem sizes on a targeted Virtex-7 vx485T-2 FPGA platform via exploration of hardware resource requirements for memory and data flow operations.
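The small-population, many-generation strategy can be sketched with a toy software GA for the TSP. The operators and parameters below are illustrative choices, not the FPGA design or the operators used in the study:

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def ga_tsp(dist, pop_size=8, generations=500, mutation_rate=0.3, seed=0):
    """Tiny-population GA: elitism, a simple order-based crossover and
    swap mutation, run for many generations to compensate for the small
    population, as the study's experiments suggest is viable."""
    rng = random.Random(seed)
    n = len(dist)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        next_pop = pop[:2]                          # keep the two best tours
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(pop[:pop_size // 2], 2)
            a, b = sorted(rng.sample(range(n), 2))
            segment = p1[a:b]                       # inherit a slice of p1,
            child = segment + [c for c in p2 if c not in segment]
            if rng.random() < mutation_rate:        # swap two random cities
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=lambda t: tour_length(t, dist))
```

A hardware version would replace the Python lists with fixed-width block RAM entries, which is why population size drives FPGA resource usage so directly.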
Abstract:
Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in the prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling.
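As a flavour of the regression approaches compared, here is a minimal weighted least squares fit together with a Monte Carlo propagation of observation noise. This is an illustrative sketch of the general machinery, not the paper's Bayesian formulation or its build-up model:

```python
import numpy as np

def weighted_least_squares(X, y, w):
    """Solve (X' W X) beta = X' W y: observations with larger weight w
    count more, so less reliable measurements are down-weighted."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

def monte_carlo_prediction(X, y, w, x_new, noise_sd, n_sims=500, seed=0):
    """Treat the observations as stochastic: refit on noise-perturbed
    responses and return the mean and spread of the prediction at x_new,
    a simple stand-in for a full uncertainty analysis."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_sims):
        y_sim = y + rng.normal(0.0, noise_sd, size=y.shape)
        beta = weighted_least_squares(X, y_sim, w)
        preds.append(x_new @ beta)
    preds = np.array(preds)
    return preds.mean(), preds.std()
```

The spread returned by the Monte Carlo loop is what a fixed-input ordinary least squares fit cannot report, which is the paper's central point.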
Abstract:
RatSLAM is a navigation system based on the neural processes underlying navigation in the rodent brain, capable of operating with low-resolution monocular image data. Seminal experiments using RatSLAM include mapping an entire suburb with a web camera and a long-term robot delivery trial. This paper describes OpenRatSLAM, an open-source version of RatSLAM with bindings to the Robot Operating System framework to leverage advantages such as robot and sensor abstraction, networking, data playback, and visualization. OpenRatSLAM comprises connected ROS nodes to represent RatSLAM's pose cells, experience map, and local view cells, as well as a fourth node that provides visual odometry estimates. The nodes are described with reference to the RatSLAM model and salient details of the ROS implementation such as topics, messages, parameters, class diagrams, sequence diagrams, and parameter tuning strategies. The performance of the system is demonstrated on three publicly available open-source datasets.
Abstract:
Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These numerical implementations have used a fixed stepsize, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. It has been necessary, in the deterministic case, to consider the "best" choice for an initial stepsize, as well as developing effective strategies for stepsize control; the same, of course, must be carried out in the stochastic case. In this paper, proportional integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers from a digital filter theory point of view, via PI with derivative (PID) control, is also implemented. The implementations show the improvement in efficiency that can be attained when using these control theory approaches compared with the regular stepsize change strategy.
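The PI controller idea is easiest to see on a deterministic embedded pair. The sketch below uses an Euler/Heun pair and illustrative gains; the paper applies the same style of control law to embedded stochastic Runge-Kutta pairs, which is not reproduced here:

```python
import math

def heun_euler_step(f, t, y, h):
    """One step of the embedded Euler (order 1) / Heun (order 2) pair;
    the difference between the two solutions estimates the local error."""
    k1 = f(t, y)
    k2 = f(t + h, y + h * k1)
    y_low = y + h * k1                    # Euler
    y_high = y + 0.5 * h * (k1 + k2)      # Heun (used to advance)
    return y_high, abs(y_high - y_low)

def integrate_pi(f, t0, y0, t_end, tol=1e-6, h=0.1, kI=0.3, kP=0.4):
    """Variable-stepsize integration with a PI stepsize controller:
    the step change reacts to both the current error (integral term)
    and its ratio to the previous error (proportional term), giving
    smoother stepsize sequences than the classical err-only rule."""
    t, y, err_prev = t0, y0, tol
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = heun_euler_step(f, t, y, h)
        err = max(err, 1e-15)
        factor = 0.9 * (tol / err) ** kI * (err_prev / err) ** kP
        if err <= tol:                    # accept the step
            t, y = t + h, y_new
            err_prev = err
        h *= min(5.0, max(0.2, factor))   # clamp the stepsize change
    return y
```

Rejected steps simply shrink `h` and retry; in the stochastic setting the same Brownian increments must be reused (and refined) on rejection, which is one of the extra complications the paper addresses.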
Abstract:
On average, 560 fatal run-off-road crashes occur annually in Australia and 135 in New Zealand. In addition, there are more than 14,000 run-off-road crashes causing injuries each year across both countries. In rural areas, run-off-road casualty crashes constitute 50-60% of all casualty crashes. Their severity is particularly high, with more than half of those involved sustaining fatal or serious injuries. This paper reviews the existing approach to roadside hazard risk assessment, selection of clear zones and hazard treatments. It proposes a modified approach to roadside safety evaluation and management: a methodology based on statistical modelling of run-off-road casualty crashes, and application of locally developed crash modification factors and severity indices. Clear zones, safety barriers and other roadside design/treatment options are evaluated with a view to minimising fatal and serious injuries – the key Safe System objective. The paper concludes with a practical demonstration of the proposed approach. The paper is based on findings from a four-year Austroads research project into improving roadside safety in the Safe System context.
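The arithmetic of applying crash modification factors (CMFs) and severity indices is multiplicative. A minimal sketch with invented numbers, not figures from the Austroads project:

```python
def expected_crashes(base_frequency, cmfs):
    """Apply crash modification factors multiplicatively to a base crash
    frequency: each CMF below 1.0 represents a treatment that reduces
    expected crashes, the usual road-safety convention."""
    result = base_frequency
    for cmf in cmfs:
        result *= cmf
    return result

def fatal_serious(expected, severity_index):
    """Scale expected casualty crashes by a severity index to estimate
    the fatal and serious injury component targeted by the Safe System."""
    return expected * severity_index
```

For example, a site with 10 expected casualty crashes per period, a barrier treatment with CMF 0.75 and a clear-zone widening with CMF 0.9 would yield 10 x 0.75 x 0.9 = 6.75 expected crashes before the severity index is applied.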
Abstract:
The dynamic capabilities view (DCV) focuses on renewal of firms' strategic knowledge resources so as to sustain competitive advantage within turbulent markets. Within the context of the DCV, the focus of knowledge management (KM) is to develop the knowledge management capability (KMC) through deploying knowledge governance mechanisms that are conducive to facilitating knowledge processes so as to produce superior business performance over time. The essence of KM performance evaluation is to assess how well the KMC is configured with knowledge governance mechanisms and processes that enable a firm to achieve superior performance through matching its knowledge base with market needs. However, little research has been undertaken to evaluate KM performance from the DCV perspective. This study employed a survey study design and adopted hypothesis-testing approaches to develop a capability-based KM evaluation framework (CKMEF) that upholds the basic assertions of the DCV. Under the governance of the framework, a KM index (KMI) and a KM maturity model (KMMM) were derived not only to indicate the extent to which a firm's KM implementations fulfil its strategic objectives and to identify the evolutionary phase of its KMC, but also to benchmark the KMC in the research population. The research design ensured that the evaluation framework and instruments have statistical significance and good generalizability to be applied in the research population, namely construction firms operating in the dynamic Hong Kong construction market. The study demonstrated the feasibility of quantitatively evaluating the development of the KMC and revealing the performance heterogeneity associated with the development.
Abstract:
Our daily lives are becoming more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, from payment systems to assisting the lives of elderly or disabled people. Security threats for these devices are becoming increasingly dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level. Therefore, third-party developers have the opportunity to develop kernel-based low-level security tools, which is not usual for smartphone platforms. Android quickly gained popularity among smartphone developers and even beyond, since it is based on Java on top of "open" Linux, in comparison to former proprietary platforms which have very restrictive SDKs and corresponding APIs. Symbian OS, for example, which held the greatest market share among all smartphone OSs, closed critical APIs to common developers and introduced application certification, since this OS was the main target for smartphone malware in the past. In fact, more than 290 malware variants designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. In this work, we present our current results in analyzing the security of Android smartphones with a focus on its Linux side. Our results are not limited to Android; they are also applicable to Linux-based smartphones such as the OpenMoko Neo FreeRunner. Our contribution in this work is three-fold. First, we analyze the Android framework and the Linux kernel to check security functionalities. We survey well-accepted security mechanisms and tools which can increase device security.
We provide descriptions of how to adopt these security tools on the Android kernel, and provide an analysis of their overhead in terms of resource usage. As open smartphones are released and may increase their market share similarly to Symbian, they may attract the attention of malware writers. Therefore, our second contribution focuses on malware detection techniques at the kernel level. We test the applicability of existing signature and intrusion detection methods in the Android environment. We focus on monitoring events in the kernel; that is, identifying critical kernel, log file, file system and network activity events, and devising efficient mechanisms to monitor them in a resource-limited environment. Our third contribution involves initial results of our malware detection mechanism based on static function call analysis. We identified approximately 105 Executable and Linking Format (ELF) executables installed on the Linux side of Android. We performed a statistical analysis of the function calls used by these applications. The results of the analysis can be compared to newly installed applications to detect significant differences. Additionally, certain function calls indicate malicious activity. Therefore, we present a simple decision tree for deciding the suspiciousness of the corresponding application. Our results present a first step towards detecting malicious applications on Android-based devices.
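The decision-tree idea can be caricatured in a few lines: compare an application's function-call profile against a baseline built from known-good executables and flag deviations. All call names, counts and weights below are invented for illustration and are not the authors' data or rules:

```python
from collections import Counter

# Hypothetical baseline: function-call counts aggregated over known-good
# ELF executables (names and numbers are illustrative only).
BASELINE = Counter({"read": 120, "write": 110, "open": 95, "socket": 12})
SUSPICIOUS_CALLS = {"ptrace", "mprotect"}   # illustrative red-flag calls

def suspiciousness(call_counts):
    """Toy scoring rule in the spirit of a decision tree over static
    function-call features: penalise calls on a suspicious list heavily,
    and calls never seen in the baseline lightly."""
    score = 0
    for call, n in call_counts.items():
        if call in SUSPICIOUS_CALLS:
            score += 2 * n        # known-dangerous call: strong evidence
        elif call not in BASELINE:
            score += n            # unseen call: weak evidence of deviation
    return score
```

A real system would extract the call counts from the ELF symbol and relocation tables and use statistically derived thresholds rather than these hand-picked weights.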
Abstract:
Background: Extracorporeal membrane oxygenation (ECMO) is a complex rescue therapy used to provide cardiac and/or respiratory support for critically ill patients who have failed maximal conventional medical management. ECMO is based on a modified cardiopulmonary bypass (CPB) circuit and can provide cardiopulmonary support for up to several months. It can be used in a veno-venous configuration (VV-ECMO) for isolated respiratory failure, or in a veno-arterial configuration (VA-ECMO) where support is necessary for cardiac +/- respiratory failure. The ECMO circuit consists of five main components: large-bore cannulae (access cannulae) for drainage of the venous system, and return cannulae to either the venous (in VV-ECMO) or arterial (in VA-ECMO) system; an oxygenator, with a vast surface area of hollow filaments, which allows addition of oxygen and removal of carbon dioxide; a centrifugal blood pump, which allows propulsion of blood through the circuit at up to 10 L/minute; a control module; and a thermoregulatory unit, which allows for exact temperature control of the extracorporeal blood. Methods: The first successful use of ECMO for ARDS in adults occurred in 1972, and its use has become more commonplace over the last 30 years, supported by improvements in the design and biocompatibility of the equipment, which have reduced the morbidity associated with this modality. Whilst the use of ECMO in the neonatal population has been supported by numerous studies, the evidence upon which ECMO was integrated into adult practice was substantially less robust. Results: Recent data, including the CESAR study (Conventional Ventilatory Support versus Extracorporeal Membrane Oxygenation for Severe Adult Respiratory Failure), have added a degree of evidence to the role of ECMO in such a patient population. The CESAR study analysed 180 patients and confirmed that ECMO was associated with an improved rate of survival.
More recently, ECMO has been utilised in numerous situations within the critical care area, including support of high-risk percutaneous interventions in the cardiac catheter lab, the operating room and the emergency department, as well as in specialised inter-hospital retrieval services. The increased understanding of the risk:benefit profile of ECMO, along with a reduction in the morbidity associated with its use, will doubtless lead to a substantial rise in the utilisation of this modality. As with all extracorporeal circuits, ECMO opposes the basic premises of the mammalian inflammation and coagulation cascades: where blood comes into contact with a foreign circuit, both these cascades are activated. Anticoagulation is readily dealt with through the use of agents such as heparin, but the inflammatory excess, whilst less macroscopically obvious, continues unabated. Platelet consumption and neutrophil activation occur rapidly, and the clinician is faced with balancing the need for anticoagulation for the circuit against haemostasis in an acutely bleeding patient. Alterations in pharmacokinetics may result in inadequate levels of disease-modifying therapeutics, such as antibiotics, hence paradoxically delaying recovery from conditions such as pneumonia. Key elements of nutrition and the innate immune system may similarly be affected. Summary: This presentation will discuss the basic features of ECMO for the nonspecialist, and review the clinical conundrum faced by the team treating these most complex cases.
Abstract:
Private data stored on smartphones is a precious target for malware attacks. A constantly changing environment, e.g. switching network connections, can cause unpredictable threats and requires an adaptive approach to access control. Context-based access control uses dynamic environmental information, incorporating it into access decisions. We propose an "ecosystem-in-an-ecosystem" which acts as a secure container for trusted software, aimed at enterprise scenarios where users are allowed to use private devices. We have implemented a proof-of-concept prototype of an access control framework that processes changes to low-level sensors and semantically enriches them, adapting access control policies to the current context. This allows the user or the administrator to maintain fine-grained control over resource usage by compliant applications. Hence, resources local to the trusted container remain under the control of the enterprise policy. Our results show that context-based access control can be done on smartphones without major performance impact.
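A minimal sketch of the idea, with policy structure and context labels invented for illustration (not the prototype's API): low-level sensor values are semantically enriched into a context label, and the enriched context gates access decisions.

```python
def enrich(raw_context):
    """Semantically enrich low-level sensor values into a context label;
    here, only the network connection is inspected (illustrative rule)."""
    if raw_context.get("network") == "corporate_wifi":
        return "enterprise"
    return "untrusted"

def allow(subject, resource, raw_context, policy):
    """Grant access only if the policy permits this resource for the
    subject in the current semantic context, so a network switch can
    silently revoke access without changing the policy itself."""
    context = enrich(raw_context)
    return resource in policy.get((subject, context), set())
```

The point of the enrichment layer is that policies are written against stable semantic contexts ("enterprise", "untrusted") rather than raw, rapidly changing sensor readings.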
Abstract:
This report is a technical assessment of the hydrological environment of the southern Moreton Bay islands and follows the terms of reference supplied by the then Queensland Department of Natural Resources and Water. The terms of reference describe stage 1 as a condition assessment and stage 2 as an assessment of the implications of water planning scenarios on future condition. This report is the first stage of a two-stage investigation whose primary purpose is to identify and assess groundwater dependent ecosystems (GDEs) and the groundwater flow regimes necessary to support them. Within this context, the groundwaters themselves are also considered and comment made on their condition. Information provided in this report will inform an amendment to the Logan Basin Water Resource Plan to incorporate the southern Moreton Bay islands. The study area is the water resource plan amendment area, which includes North and South Stradbroke islands and the smaller islands between these and the mainland, including the inhabited smaller rocky islands—namely, Macleay, Russell, Karragarra, Lamb and Coochiemudlo islands. This assessment is largely a desktop study based on existing information, but incorporates some field observations, input from experts in specific areas and community representatives, and the professional experience and knowledge of the authors. This report reviews existing research and information on the southern Moreton Bay area with an emphasis on North Stradbroke Island, as it represents the largest and most regionally significant groundwater resource in southern Moreton Bay. The report provides an assessment of key water-related environmental features, their condition and their degree of dependence on groundwater. This report also assesses the condition and status of ecosystems within this region.
In addition, the report identifies information gaps, uncertainties and potential impacts; reviews groundwater models that have been developed for North Stradbroke Island; and makes recommendations on monitoring and research needs.
Abstract:
The term Design Led Innovation is emerging as a fundamental business process, which is rapidly being adopted by large as well as small-to-medium-sized firms. The value that design brings to an organisation is a different way of thinking, of framing situations and possibilities, doing things and tackling problems: essentially a cultural transformation of the way the firm undertakes its business. Being Design Led is increasingly seen by business as a driver of company growth, allowing firms to provide a strong point of difference to their stakeholders. Achieving this Design Led process requires strong leadership to enable the organisation to develop a clear vision for top-line growth, based on deep customer insights and expanded through customer and stakeholder engagements, the outcomes of which are then adopted by all aspects of the business. To achieve this goal, several tools and processes are available, which need to be linked to new organisational capabilities within a business transformation context. The Design Led Innovation Team focuses on embedding tools and processes within an organisation and matching this with design leadership qualities to enable companies to create breakthrough innovation and achieve sustained growth, ultimately through transforming their business model. As all information for these case studies was derived from publicly accessed data, this resource is not intended to be used as reference material, but rather as a learning tool for designers to begin to consider and explore businesses at a strategic level. It is not the results that are key, but rather the process and philosophies that were used to create these case studies and to disseminate this way of thinking amongst the design community. It is this process of unpacking a business, guided by the framework of Osterwalder's Business Model Canvas*, which provides an important tool for designers to gain a greater perspective of a company's true innovation potential.