862 results for publicly verifiable
Abstract:
This qualitative two-site case study examined the capacity building practices that Children’s Services Councils (CSCs), independent units of local government, provide to nonprofit organizations (NPOs) contracted to deliver human services. The contracting literature is replete with recommendations for government to provide capacity building to contracted NPOs, yet there is a dearth of scholarship on this topic. The study’s purpose was to increase the understanding of capacity building provided in a local government contracting setting. Data collection consisted primarily of in-depth interviews and focus groups with 73 staff from two CSCs and 28 contracted NPOs. Interview data were supplemented by participant observation and review of secondary data. The study analyzed capacity building needs, practices, influencing factors, and outcomes. The study identified NPO capacity building needs in: documentation and reporting, financial management, program monitoring and evaluation, participant recruitment and retention, and program quality. Additionally, sixteen different types of CSC capacity building practices were identified. Results indicated that three major factors impacted CSC capacity building: CSC capacity building goals, the relationship between the CSC and NPOs, and the level of NPO participation. Study results also provided insight into the dynamics of the CSC capacity building process, including unique problems, challenges, and opportunities as well as necessary resources. The results indicated that the CSCs’ relational contracting approach facilitated CSC capacity building and that CSC contract managers were central players in the process. The study provided evidence that local government agencies can serve as effective builders of NPO capacity. Additionally, results indicated that much of what is known about capacity building can be applied in this previously unstudied capacity building setting. Finally, the study laid the groundwork for future development of a model for capacity building in a local government contracting setting.
THE COSTS OF RAISING EQUITY RATIO FOR BANKS: Evidence from publicly listed banks operating in Finland
Abstract:
The solvency of banks differs from that of other corporations: a bank’s equity ratio is lower than that of firms in other industries. However, a functioning banking industry has a huge impact on society as a whole. Banks’ equity ratios need to be higher because higher equity makes the banking industry more stable, as the probability of bank failures decreases. If a bank fails, the government compensates the deposits, since it has granted the bank’s depositors deposit insurance; this means that the payment ultimately comes from taxpayers. The economic debate has long concentrated on the costs of raising equity ratios. It has been a common belief that raising the equity ratio increases a bank’s funding costs at the same pace, and that these costs are passed on to the bank’s customers as higher service charges. Despite this common belief, the actual response of funding costs to a higher equity ratio has been studied only a little in Europe, and no such study has been conducted in Finland. Before it can be determined whether the greater stability of the banking industry brought about by higher equity levels compensates for the extra funding costs, the actual increase in funding costs must be calculated. Currently the banking industry is governed by complex and heavy regulation, and maintaining such a complex system imposes major costs in itself. This research builds on the Modigliani and Miller theorem, which shows that a firm’s capital structure is irrelevant to its funding costs. In addition, this research follows the calculations of Miller, Yang and Marcheggiano (2012) and Vale (2011), who calculate funding costs after the doubling of specific banks’ equity ratios. The banks studied in this research are Nordea and Danske Bank, because they are the two largest banks operating in Finland and both have a company form that enables the calculations. To calculate the costs of halving their leverage, this study used the Capital Asset Pricing Model (CAPM). Halving Danske Bank’s leverage raised its funding costs by 16 to 257 basis points, depending on the method of assessment; for Nordea the increase was 11 to 186 basis points when its leverage was halved. Based on these results, doubling a bank’s equity ratio does not increase its funding costs one-for-one; in fact, the increase is quite modest. More solvent banks would increase the stability of the banking industry enormously, while the increase in funding costs is low. If the costs of bank regulation exceed the increase in funding costs from a higher equity ratio, higher equity can be seen as a better way of stabilizing the banking industry than heavy regulation.
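To make the CAPM-based calculation concrete, the following is a minimal sketch, assuming Modigliani and Miller style re-levering of the equity beta when the equity ratio doubles (leverage halves); all figures, function names and the risk-free-debt assumption are placeholders, not the thesis data or its exact method.

```python
# A minimal sketch, assuming MM-style re-levering of the equity beta when the
# equity ratio doubles. All inputs are placeholders, not the thesis data.

def capm(risk_free, beta, market_premium):
    """Required return on equity under the Capital Asset Pricing Model."""
    return risk_free + beta * market_premium

def funding_cost_change_bps(equity_ratio, beta_equity, risk_free, market_premium, cost_of_debt):
    """Change in average funding cost (basis points) when the equity ratio doubles."""
    # Unlever the equity beta: with an unchanged asset beta and (near) risk-free
    # debt, the equity beta is proportional to (1 + D/E).
    d_over_e = (1 - equity_ratio) / equity_ratio
    beta_asset = beta_equity / (1 + d_over_e)

    new_equity_ratio = 2 * equity_ratio
    new_d_over_e = (1 - new_equity_ratio) / new_equity_ratio
    new_beta_equity = beta_asset * (1 + new_d_over_e)

    cost_before = (equity_ratio * capm(risk_free, beta_equity, market_premium)
                   + (1 - equity_ratio) * cost_of_debt)
    cost_after = (new_equity_ratio * capm(risk_free, new_beta_equity, market_premium)
                  + (1 - new_equity_ratio) * cost_of_debt)
    return (cost_after - cost_before) * 10_000

# Placeholder example: 5% equity ratio, equity beta 1.2, 2% risk-free rate,
# 5% market premium, debt priced at the risk-free rate. Under this pure
# Modigliani-Miller benchmark the change is zero; the 11 to 257 basis point
# estimates above reflect departures from that benchmark.
print(f"{funding_cost_change_bps(0.05, 1.2, 0.02, 0.05, 0.02):.1f} bps")
```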
Abstract:
We investigate whether investors may benefit from using the accruals quality measure to assess the level of earnings management exercised by firms when preparing their accounting statements. More earnings management is expected to be associated with higher information asymmetry among stock market participants, because it makes earnings information less precise and thus gives informed investors an information advantage over liquidity traders. Our results, based on a sample of European publicly traded firms, are consistent with a positive association between earnings management and information asymmetry. However, given that some previous studies suggest accruals-based measures may be noisy indicators of earnings management, we further develop and test a method to enhance the performance of accruals quality in detecting earnings management.
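The abstract does not spell out the accruals quality measure; as a purely illustrative sketch, one widely used proxy in this literature (Dechow-Dichev style) regresses working-capital accruals on lagged, current and future operating cash flows and uses the variability of the residuals. Whether the paper uses exactly this specification is an assumption.

```python
# Illustrative sketch of a Dechow-Dichev style accruals quality proxy.
import numpy as np

def accruals_quality(wc_accruals, cfo_prev, cfo_curr, cfo_next):
    """Regress working-capital accruals on lagged, current and future operating
    cash flows; a higher residual standard deviation means noisier accruals and
    hence lower accruals quality (more scope for earnings management)."""
    y = np.asarray(wc_accruals, dtype=float)
    X = np.column_stack([np.ones_like(y),
                         np.asarray(cfo_prev, dtype=float),
                         np.asarray(cfo_curr, dtype=float),
                         np.asarray(cfo_next, dtype=float)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coef
    return residuals.std(ddof=1)
```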
Abstract:
Party 25 involved the conception and public launch of a radically new form of political party during that year’s Australian general election. The entire project was also intended as a conceptual artwork. Party 25 avoided conventional party-political approaches and was neither a protest group nor an advocacy organisation, but rather a new form of political association that confronted what we understood as the debilitating limits and impotence of contemporary parliamentary democracies in transitioning our societies towards ecological sustainability.

Party 25 was based on responding to one fundamental question, which all of its policies served: “how does humanity get to the 25th century?” By basing itself on a dramatically long-term approach uncommon within conventional politics, it raised the proposition that humanity does not have an assured future. Party 25 therefore shaped its agendas around the idea that any future now lies in human hands, and so how humanity treats the ecologies on which it depends inherently determines the quality of the inseparable relationship between its being and the being of the biophysical world.

The project was conceived through a number of discussion papers, workshops and creative works, and was launched publicly at the Judith Wright Centre, Brisbane, accompanied by a full-length showing of evocative imagery, text and sound, a series of speeches, and the launch of a succinct web presence. Through the website and the party launch, a community of interested participants and creative practitioners was sought, who would then form the basis of a nascent community of change.
Abstract:
The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. It proposes a solution to the difficulties associated with multi-step attack scenario specification and detection in such environments. The research has focused on two distinct problems: the representation of events derived from heterogeneous sources, and multi-step attack specification and detection. The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the model presented in this work preserves all low-level information while also providing high-level information in the form of abstract events. The model was designed independently of any particular IDS and may therefore be used by any IDS, intrusion forensic tool, or monitoring tool. The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect because they often involve the correlation of events from multiple sources, which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism through variable instantiation, where variables represent events as defined in the event abstraction model. The third part of the research addresses time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios that involve logs from multiple hosts, yet issues of time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing time uncertainty: clock skew compensation and clock drift modelling using linear regression. An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed based on the Python programming language syntax, and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs, as well as a synthetic dataset. The distinct feature of the public dataset is that it contains multi-step attacks involving multiple hosts with clock skew and clock drift; this allows us to demonstrate the application and advantages of the contributions of this research. All instances of multi-step attacks in the dataset were correctly identified even though significant clock skew and drift exist in the dataset. Future work identified by this research is to develop a refined unification algorithm suitable for processing streams of events to enable on-line detection. In terms of time uncertainty, identified future work is to develop mechanisms that allow automatic clock skew and clock drift identification and correction.
The immediate application of the research presented in this thesis is the framework of an off-line IDS that processes events from heterogeneous sources using abstraction and that can detect multi-step attack scenarios that may involve time uncertainty.
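As an illustration of the clock drift modelling step mentioned above, here is a minimal linear-regression sketch; it assumes pairs of host and reference timestamps are available for correlated events, and the function names are illustrative rather than the prototype's actual API.

```python
# Minimal sketch of clock drift modelling with linear regression.
import numpy as np

def fit_clock_model(host_times, ref_times):
    """Model host clock error with a linear fit: ref ~= slope*host + offset.
    The offset captures clock skew; (slope - 1) captures clock drift."""
    slope, offset = np.polyfit(np.asarray(host_times, dtype=float),
                               np.asarray(ref_times, dtype=float), deg=1)
    return slope, offset

def to_reference_time(host_time, slope, offset):
    """Map a host-local timestamp onto the reference timeline before
    correlating multi-step attack events across hosts."""
    return slope * host_time + offset

# Example: a host whose clock runs roughly 0.1% fast and about 5 s ahead.
host = np.array([0.0, 100.0, 200.0, 300.0])
ref = 0.999 * host - 5.0
slope, offset = fit_clock_model(host, ref)
print(round(to_reference_time(150.0, slope, offset), 3))  # ~144.85
```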
Abstract:
Since 1995 the buildingSMART International Alliance for Interoperability (buildingSMART) has developed a robust standard called the Industry Foundation Classes (IFC). IFC is an object-oriented data model with a related file format that has facilitated the efficient exchange of data in the development of building information models (BIM). The Cooperative Research Centre for Construction Innovation has contributed to the international effort in the development of the IFC standard, and specifically the reinforced concrete part of the latest IFC 2x3 release. The Industry Foundation Classes have been endorsed by the International Organization for Standardization as a Publicly Available Specification (PAS) under the label ISO/PAS 16739. For more details, go to http://www.tc184-sc4.org/About_TC184-SC4/About_SC4_Standards/ The current IFC model covers the building itself to a useful level of detail. The next stage of development for the IFC standard is where the building meets the ground (terrain) and civil and external works such as pavements, retaining walls, bridges, tunnels, etc. With the current focus in Australia on infrastructure projects over the next 20 years, a logical extension to this standard is the area of site and civil works. This proposal recognises that there is an existing body of work on the specification of road representation data. In particular, LandXML is recognised, as are TransXML in the broader context of transportation and CityGML for the common interfacing of city maps, buildings and roads. Examination of interfaces between IFC and these specifications is therefore within the scope of this project. That such interfaces can be developed has already been demonstrated in principle within the IFC for Geographic Information Systems (GIS) project. National road standards that are already in use should be carefully analysed and contacts established in order to gain from this knowledge. The Object Catalogue for the Road Transport Sector (OKSTRA) should be noted as an example. It is also noted that buildingSMART Norway has submitted a proposal
Abstract:
The Google Online Marketing Challenge is an ongoing collaboration between Google and academics to give students experiential learning. The Challenge gives student teams US$200 in AdWords, Google’s flagship advertising product, to develop online marketing campaigns for actual businesses. The end result is an engaging in-class exercise that provides students and professors with an exciting and pedagogically rigorous competition. Results from surveys at the end of the Challenge reveal positive appraisals from the three main constituents (students, businesses, and professors); general agreement between students and instructors regarding learning outcomes; and a few points of difference between students and instructors. In addition to describing the Challenge and its outcomes, this article reviews the post-participation questionnaires and subsequent datasets. The questionnaires and results are publicly available, and this article invites educators to mine the datasets, share their results, and offer suggestions for future iterations of the Challenge.
Abstract:
Introduction: The demand for emergency health services (EHS), in both the prehospital (ambulance) and hospital (emergency department) settings, is growing rapidly in Australia. Broader health system changes have reduced available health infrastructure, particularly hospital beds, resulting in reduced access to and congestion of the EHS, as demonstrated by longer waiting times and ambulance “ramping”. Ambulance ramping occurs when patients have a prolonged wait in the emergency vehicle due to the unavailability of hospital beds. This presentation outlines the trends in EHS demand in Queensland compared with the rest of Australia and the factors that appear to be contributing to the growth in demand. Methods: Secondary analysis was conducted using data from publicly available sources. Data from the Queensland Ambulance Service and the Queensland Health Emergency Department Information System (EDIS) were also analyzed. Results: Demand for ambulance services and emergency departments has been increasing at 8% and 4% per year, respectively, over the last decade, while accessible hospital beds have been reduced by almost 10%, contributing to emergency department congestion and possibly to the prehospital demand. While the increase in the proportion of the elderly population seems to explain a great deal of the demand for EHS, other factors also influence this growth, including patient characteristics, institutional and societal factors, economic factors, EHS arrangements, and clinical factors. Conclusions: Overcrowding of facilities that provide EHS is causing considerable community concern. This overcrowding is caused by growing demand and reduced access. The causes of this growing demand are complex and require further detailed analysis to quantify and qualify them, in order to provide a resilient foundation of evidence for future policy direction.
Abstract:
This report presents findings from the largest survey ever undertaken in Australia of aspiring creatives who work or intend to work in the digital content industries. Survey respondents included those with aspirations to work in the publicly supported, less commercial end of the Creative Industries spectrum as well as those with aspirations to work in the digital content industries. The survey gathered rich data on their characteristics, skills and attributes, barriers to employment, workforce mobility, career intentions, professional development, mentors and industry supports, and participation in communities of practice. The survey sought to determine whether aspiring creatives have the necessary skills and attributes to work effectively in the digital content industries. This task also involved finding out how they develop their skills and attributes, and what they need to develop them further.
Abstract:
One of the major challenges facing a present-day game development company is the removal of bugs from its complex virtual environments. This work presents an approach for measuring the correctness of synthetic scenes generated by the rendering system of a 3D application, such as a computer game. Our approach builds a database of labelled point clouds representing the spatiotemporal colour distribution of the objects present in a sequence of bug-free frames. This is done by converting the positions that pixels take over time into the equivalent 3D points with associated colours. Once the space of labelled points is built, each new image produced from the same game by any rendering system can be analysed by measuring its visual inconsistency in terms of distance from the database. Objects within the scene can be relocated (manually or by the application engine), yet the algorithm is still able to perform the image analysis in terms of the 3D structure and colour distribution of samples on the surface of the object. We applied our framework to the publicly available game RacingGame, developed for Microsoft XNA. Preliminary results show how this approach can be used to detect a variety of visual artifacts generated by the rendering system in a professional-quality game engine.
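A minimal sketch of the inconsistency check described above, assuming a reference database of coloured 3D surface samples has already been extracted from bug-free frames; the function names, the colour weighting, and the use of a k-d tree are illustrative assumptions rather than the paper's implementation.

```python
# Sketch: score a new frame by its distance from a reference point-cloud database.
import numpy as np
from scipy.spatial import cKDTree

def build_reference(points_xyz, colours_rgb):
    """Index reference surface samples (positions + colours) from bug-free frames."""
    return cKDTree(np.asarray(points_xyz, dtype=float)), np.asarray(colours_rgb, dtype=float)

def frame_inconsistency(ref_tree, ref_colours, frame_xyz, frame_rgb, w_colour=0.5):
    """For each 3D sample of the new frame, combine the distance to the nearest
    reference point with the colour difference at that point; average over samples."""
    dist, idx = ref_tree.query(np.asarray(frame_xyz, dtype=float), k=1)
    colour_diff = np.linalg.norm(np.asarray(frame_rgb, dtype=float) - ref_colours[idx], axis=1)
    return float(np.mean(dist + w_colour * colour_diff))

# Example with random placeholder data: frames whose samples drift away from the
# reference cloud (in position or colour) receive a higher inconsistency score.
ref_tree, ref_cols = build_reference(np.random.rand(1000, 3), np.random.rand(1000, 3))
score = frame_inconsistency(ref_tree, ref_cols, np.random.rand(200, 3), np.random.rand(200, 3))
print(f"visual inconsistency score: {score:.3f}")
```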
Abstract:
The Longitudinal Study of Australian Children (LSAC) is a major national study examining the lives of Australian children, using a cross-sequential cohort design and data from parents, children, and teachers for 5,107 infants (3–19 months) and 4,983 children (4–5 years). Its data are publicly accessible and are used by researchers from many disciplinary backgrounds. It contains multiple measures of children’s developmental outcomes as well as a broad range of information on the contexts of their lives. This paper reports on the development of summary outcome indices of child development using the LSAC data. The indices were developed to fill the need for indicators suitable for use by diverse data users, in order to guide government policy and interventions that support young children’s optimal development. The concepts underpinning the indices and the methods of their development are presented. Two outcome indices (infant and child) were developed, each consisting of three domains: health and physical development, social and emotional functioning, and learning competency. A total of 16 measures make up these three domains in the Outcome Index for the Child Cohort, and six measures for the Infant Cohort. These measures are described, and evidence supporting the structure of the domains and their underlying latent constructs is provided for both cohorts. The factorial structure of the Outcome Index is adequate for both cohorts, but was stronger for the child cohort than for the infant cohort. It is concluded that the LSAC Outcome Index is a parsimonious measure representing the major components of development that is suitable for non-specialist data users. A companion paper (Sanson et al. 2010) presents evidence of the validity of the Index.
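As a purely illustrative sketch of how domain scores and an overall index can be formed from multiple standardized measures: the actual LSAC Outcome Index construction may differ, and the column names and equal weighting below are assumptions.

```python
# Generic composite-index sketch: standardize measures, average within domains,
# then average across domains. Not the LSAC methodology itself.
import pandas as pd

DOMAINS = {
    "health_physical": ["phys_1", "phys_2"],
    "social_emotional": ["soc_1", "soc_2"],
    "learning": ["learn_1", "learn_2"],
}

def outcome_index(df: pd.DataFrame) -> pd.DataFrame:
    scores = pd.DataFrame(index=df.index)
    for domain, cols in DOMAINS.items():
        z = (df[cols] - df[cols].mean()) / df[cols].std(ddof=0)   # standardize each measure
        scores[domain] = z.mean(axis=1)                           # equal-weight domain score
    scores["overall_index"] = scores[list(DOMAINS)].mean(axis=1)  # average across domains
    return scores
```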
Abstract:
The critical impact of innovation on national and global economies has been discussed at length in the literature. Economic development requires the diffusion of innovations into markets, and it has long been recognised that economic growth and development depend upon a constant stream of innovations. Governments have been keenly aware of the need to ensure this flow does not dry to a trickle and have introduced many and varied industry policies and interventions to assist in seeding, supporting and diffusing innovations. In Australia, as in many countries, government support for the transfer of knowledge, especially from publicly funded research, has resulted in the creation of knowledge exchange intermediaries. These intermediaries are themselves service organisations, seeking innovative service offerings for their markets. The choice for most intermediaries is generally a dichotomous one, between market-pull and technology-push knowledge exchange programmes. In this article, we undertake a case analysis of one such innovative intermediary and its flagship programme. We then compare this case with other successful intermediaries in Europe. We put forward a research proposition that the design of intermediary programmes must match the service type they offer: market-pull programmes require market-pull design, in close collaboration with industry, whereas technology-push programmes can be problem-solving innovations where demand is latent. The discussion reflects the need for an evolution in knowledge transfer policies and programmes beyond the first generation ushered in with the US Bayh-Dole Act (1980) and Stevenson-Wydler Act (1984). The data analysed are a case study comparison of market-pull and technology-push programmes, focusing on primary and secondary socio-economic benefits (using both Australian and international comparisons).