Abstract:
We investigate existing cloud storage schemes and identify limitations in each one based on the security services that they provide. We then propose a new cloud storage architecture that extends the CloudProof scheme of Popa et al. to provide availability assurance. This is accomplished by incorporating a proof of storage protocol. As a result, we obtain the first secure cloud storage scheme that furnishes all three properties of availability, fairness and freshness.
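The availability assurance rests on a proof of storage protocol in which the client spot-checks randomly chosen file blocks. A minimal Python sketch of this challenge-response idea (an illustrative construction with invented names, not the actual protocol of the proposed scheme):

```python
import hashlib
import hmac
import os
import random

BLOCK_SIZE = 64

def tag_blocks(key, blocks):
    # The client tags each block (bound to its index) before upload.
    return [hmac.new(key, i.to_bytes(4, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def prove(blocks, tags, challenge):
    # The server answers a challenge with the requested blocks and tags.
    return [(i, blocks[i], tags[i]) for i in challenge]

def verify(key, response):
    # The client recomputes each MAC; any mismatch signals data loss.
    return all(hmac.compare_digest(
                   hmac.new(key, i.to_bytes(4, "big") + b, hashlib.sha256).digest(), t)
               for i, b, t in response)

key = os.urandom(32)
data = [os.urandom(BLOCK_SIZE) for _ in range(100)]
tags = tag_blocks(key, data)

challenge = random.sample(range(100), 10)
assert verify(key, prove(data, tags, challenge))       # honest server passes

corrupted = data[:]
corrupted[challenge[0]] = os.urandom(BLOCK_SIZE)       # server lost a block
assert not verify(key, prove(corrupted, tags, challenge))
```

Spot-checking a random subset lets the client detect large-scale data loss with high probability while transferring only a few blocks per audit.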
Abstract:
The prevalence of International New Ventures (INVs) has increased during the past twenty years. Nevertheless, to date there has been no general consensus within the literature on an explanation for the rapid internationalisation of some firms. Do they follow a process similar to that of other internationalising firms, based on a more measured, incremental, sequential process of internationalisation? This paper proposes and tests an innovation diffusion model of the internationalisation of small-firm INVs and other firms by drawing on key innovation diffusion models from the literature. The results of this analysis indicate that the synthesised model of export adoption is effective in explaining the internationalisation process of INVs and other firms within the Queensland Food and Beverage Industry. Significantly, the features of the original innovation diffusion models developed in the consumer behaviour literature, which had received limited examination within the internationalisation literature, were confirmed. This includes the ability of firms, or specifically decision-makers, to skip stages based on previous experience.
Abstract:
This practice-led study explores different ways the subject of sustain-ability can be addressed within an Interactive Media Arts practice. The exploration encompasses three creative projects: Charmed, Distracted and e. Menura superba. Grounded in an ecological philosophy inspired by vegetarianism and the critical design philosophy of defuturing, the work shows how such a philosophical position can guide the redirection of practice. The concern for sustain-ability within my practice, and more generally the question of Interactive Media Arts and sustain-ability, I refer to as a problématique. The objective of this study is not to find an answer or a truth to an instrumentally posed question, but to explore the complexities of the problématique through a program of practice and intellectual investigation. The aim is to redirect my practice and to find a renewed raison d'être for practice through a process of opening up, encountering, and discovering otherwise unknown possibilities for practice. In the context of sustain-ability, this opening up of possibilities can be considered a form of futuring. Such futuring, I argue, is only possible if the things we take for granted as integral aspects of our being, practices and life worlds are revealed in ways that estrange them, rendering them visible in ways that allow questioning and change.
Abstract:
The benefits of applying tree-based methods to the modelling of financial assets, as opposed to linear factor analysis, are increasingly being understood by market practitioners. Tree-based models such as CART (classification and regression trees) are particularly well suited to analysing stock market data, which is noisy and often contains non-linear relationships and high-order interactions. CART was originally developed in the 1980s by medical researchers disheartened by the stringent assumptions applied by traditional regression analysis (Breiman et al. [1984]). In the intervening years, CART has been successfully applied to many areas of finance such as the classification of financial distress of firms (see Frydman, Altman and Kao [1985]), asset allocation (see Sorensen, Mezrich and Miller [1996]), equity style timing (see Kao and Shumaker [1999]) and stock selection (see Sorensen, Miller and Ooi [2000])...
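The suitability of CART for noisy, non-linear data comes from its greedy splitting rule. A toy sketch of that rule for regression (the data and the single-feature threshold search here are invented for illustration, not the authors' implementation): at each node, pick the threshold that minimises the summed squared error of the two children.

```python
def sse(ys):
    # Sum of squared deviations from the node mean (impurity).
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(xs, ys):
    # Try the midpoint between each pair of adjacent predictor values
    # and keep the one with the lowest combined child impurity.
    pairs = sorted(zip(xs, ys))
    best_cost, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_cost, best_thr = cost, thr
    return best_thr

# Hypothetical data: a factor x with a regime change in returns y at 0.5.
x = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
y = [1.0, 1.1, 0.9, 1.0, 3.0, 3.2, 2.9, 3.1]
print(best_split(x, y))  # the split lands at the regime change, 0.5
```

A linear regression on the same data would smear the jump across the whole range; the tree isolates it with one split, which is the property the abstract highlights.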
Abstract:
In 2010, the State Library of Queensland (SLQ) donated their out-of-copyright Queensland images to Wikimedia Commons. One direct effect of publishing the collections at Wikimedia Commons is the ability of general audiences to participate and help the library in processing the images in the collection. This paper will discuss a project that explored user participation in the categorisation of the State Library of Queensland digital image collections. The outcomes of this project can be used to gain a better understanding of user participation that leads to improved access to library digital collections. Two techniques for data collection were used: document analysis and interviews. Document analysis was performed on the Wikimedia Commons monthly reports, while interviews served as the main data collection technique in this research. The data collected from document analysis helped the researchers devise appropriate questions for the interviews. The interviews were undertaken with participants who were divided into two groups: SLQ staff members and Wikimedians (users who participate in Wikimedia). The two sets of data collected from participants were analysed independently and compared. This method was useful for understanding the differences between the experiences of categorisation from both the librarians' and the users' perspectives. This paper will provide a discussion of the preliminary findings that have emerged from each participant group. This research provides preliminary information about the extent of user participation in the categorisation of SLQ collections in Wikimedia Commons, which SLQ and other interested libraries can use when categorising their digital content to improve user access to their collections in the future.
Abstract:
Exceeding the speed limit and driving too fast for the conditions are regularly cited as significant contributing factors in traffic crashes, particularly fatal and serious injury crashes. Despite an extensive body of research highlighting the relationship between increased vehicle speeds and crash risk and severity, speeding remains a pervasive behaviour on Australian roads. The development of effective countermeasures designed to reduce the prevalence of speeding behaviour requires that this behaviour is well understood. The primary aim of this program of research was to develop a better understanding of the influence of drivers' perceptions and attitudes toward police speed enforcement on speeding behaviour. Study 1 employed focus group discussions with 39 licensed drivers to explore the influence of perceptions relating to specific characteristics of speed enforcement policies and practices on drivers' attitudes towards speed enforcement. Three primary factors were identified as being most influential: site selection; visibility; and automaticity (i.e., whether the enforcement approach is automated/camera-based or manually operated). Perceptions regarding these enforcement characteristics were found to influence attitudes regarding the perceived legitimacy and transparency of speed enforcement. Moreover, misperceptions regarding speed enforcement policies and practices appeared to also have a substantial impact on attitudes toward speed enforcement, typically in a negative direction. These findings have important implications for road safety given that prior research has suggested that the effectiveness of speed enforcement approaches may be reduced if efforts are perceived by drivers as being illegitimate, such that they do little to encourage voluntary compliance. Study 1 also examined the impact of speed enforcement approaches varying in the degree of visibility and automaticity on self-reported willingness to comply with speed limits.
These discussions suggested that all of the examined speed enforcement approaches (see Section 1.5 for more details) generally showed potential to reduce vehicle speeds and encourage compliance with posted speed limits. Nonetheless, participant responses suggested a greater willingness to comply with approaches operated in a highly visible manner, irrespective of the corresponding level of automaticity of the approach. While less visible approaches were typically associated with poorer rates of driver acceptance (e.g., perceived as sneaky and unfair), participants reported that such approaches would likely encourage long-term and network-wide impacts on their own speeding behaviour, as a function of the increased unpredictability of operations and increased direct (specific deterrence) and vicarious (general deterrence) experiences with punishment. Participants in Study 1 suggested that automated approaches, particularly when operated in a highly visible manner, do little to encourage compliance with speed limits except in the immediate vicinity of the enforcement location. While speed cameras have been criticised on such grounds in the past, such approaches can still have substantial road safety benefits if implemented in high-risk settings. Moreover, site-learning effects associated with automated approaches can also be argued to be a beneficial by-product of enforcement, such that behavioural modifications are achieved even in the absence of actual enforcement. Conversely, manually operated approaches were reported to be associated with more network-wide impacts on behaviour. In addition, the reported acceptance of such methods was high, due to the increased swiftness of punishment, ability for additional illegal driving behaviours to be policed and the salutary influence associated with increased face-to-face contact with authority. Study 2 involved a quantitative survey conducted with 718 licensed Queensland drivers from metropolitan and regional areas. 
The survey sought to further examine the influence of the visibility and automaticity of operations on self-reported likelihood and duration of compliance. Overall, the results from Study 2 corroborated those of Study 1. All examined approaches were again found to encourage compliance with speed limits, such that all approaches could be considered to be effective. Nonetheless, significantly greater self-reported likelihood and duration of compliance was associated with visibly operated approaches, irrespective of the corresponding automaticity of the approach. In addition, the impact of automaticity was influenced by visibility, such that significantly greater self-reported likelihood of compliance was associated with manually operated approaches, but only when they are operated in a less visible fashion. Conversely, manually operated approaches were associated with significantly greater durations of self-reported compliance, but only when they are operated in a highly visible manner. Taken together, the findings from Studies 1 and 2 suggest that enforcement efforts, irrespective of their visibility or automaticity, generally encourage compliance with speed limits. However, the duration of these effects on behaviour upon removal of the enforcement efforts remains questionable and represents an area where current speed enforcement practices could possibly be improved. Overall, it appears that identifying the optimal mix of enforcement operations, implementing them at a sufficient intensity and increasing the unpredictability of enforcement efforts (e.g., greater use of less visible approaches, random scheduling) are critical elements of success. Hierarchical multiple regression analyses were also performed in Study 2 to investigate the punishment-related and attitudinal constructs that influence self-reported frequency of speeding behaviour.
The research was based on the theoretical framework of expanded deterrence theory, augmented with three particular attitudinal constructs. Specifically, previous research examining the influence of attitudes on speeding behaviour has typically focussed on attitudes toward speeding behaviour in general only. This research sought to more comprehensively explore the influence of attitudes by also individually measuring and analysing attitudes toward speed enforcement and attitudes toward the appropriateness of speed limits on speeding behaviour. Consistent with previous research, a number of classical and expanded deterrence theory variables were found to significantly predict self-reported frequency of speeding behaviour. Significantly greater speeding behaviour was typically reported by those participants who perceived punishment associated with speeding to be less certain, who reported more frequent use of punishment avoidance strategies and who reported greater direct experiences with punishment. A number of interesting differences in the significant predictors among males and females, as well as younger and older drivers, were reported. Specifically, classical deterrence theory variables appeared most influential on the speeding behaviour of males and younger drivers, while expanded deterrence theory constructs appeared more influential for females. These findings have important implications for the development and implementation of speeding countermeasures. Of the attitudinal factors, significantly greater self-reported frequency of speeding behaviour was reported among participants who held more favourable attitudes toward speeding and who perceived speed limits to be set inappropriately low. Disappointingly, attitudes toward speed enforcement were found to have little influence on reported speeding behaviour, over and above the other deterrence theory and attitudinal constructs. 
Indeed, the relationship between attitudes toward speed enforcement and self-reported speeding behaviour was completely accounted for by attitudes toward speeding. Nonetheless, the complexity of attitudes toward speed enforcement is not yet fully understood and future research should more comprehensively explore the measurement of this construct. Finally, given the wealth of evidence (both in general and emerging from this program of research) highlighting the association between punishment avoidance and speeding behaviour, Study 2 also sought to investigate the factors that influence the self-reported propensity to use punishment avoidance strategies. A standard multiple regression analysis was conducted for exploratory purposes only. The results revealed that punishment-related and attitudinal factors significantly predicted approximately one fifth of the variance in the dependent variable. The perceived ability to avoid punishment, vicarious punishment experience, vicarious punishment avoidance and attitudes toward speeding were all significant predictors. Future research should examine these relationships more thoroughly and identify additional influential factors. In summary, the current program of research has a number of implications for road safety and speed enforcement policy and practice decision-making. The research highlights a number of potential avenues for the improvement of public education regarding enforcement efforts and provides a number of insights into punishment avoidance behaviours. In addition, the research adds strength to the argument that enforcement approaches should not only demonstrate effectiveness in achieving key road safety objectives, such as reduced vehicle speeds and associated crashes, but also strive to be transparent and legitimate, such that voluntary compliance is encouraged. A number of potential strategies are discussed (e.g., point-to-point speed cameras, intelligent speed adaptation).
The correct mix and intensity of enforcement approaches appears critical for achieving optimum effectiveness from enforcement efforts, as well as enhancements in the unpredictability of operations and swiftness of punishment. Achievement of these goals should increase both the general and specific deterrent effects associated with enforcement through an increased perceived risk of detection and a more balanced exposure to punishment and punishment avoidance experiences.
Abstract:
Owing to the successful use of non-invasive vibration analysis to monitor the progression of dental implant healing and stabilization, it is now being considered as a method to monitor femoral implants in transfemoral amputees. This study uses composite femur-implant physical models to investigate the ability of modal analysis to detect changes at the interface between the implant and bone simulating those that occur during osseointegration. Using electromagnetic shaker excitation, differences were detected in the resonant frequencies and mode shapes of the model when the implant fit in the bone was altered to simulate the two interface cases considered: firm and loose fixation. The study showed that it is beneficial to examine higher resonant frequencies and their mode shapes (rather than the fundamental frequency only) when assessing fixation. The influence of the model boundary conditions on the modal parameters was also demonstrated. Further work is required to more accurately model the mechanical changes occurring at the bone-implant interface in vivo, as well as further refinement of the model boundary conditions to appropriately represent the in vivo conditions. Nevertheless, the ability to detect changes in the model dynamic properties demonstrates the potential of modal analysis in this application and warrants further investigation.
Abstract:
PURPOSE: To examine the basis of previous findings of an association between indices of driving safety and visual motion sensitivity, and to examine whether this association could be explained by low-level changes in visual function. METHODS: 36 visually normal participants (aged 19-80 years) completed a battery of standard vision tests including visual acuity, contrast sensitivity and automated visual fields, and two tests of motion perception: sensitivity for movement of a drifting Gabor stimulus, and sensitivity for displacement in a random-dot kinematogram (Dmin). Participants also completed a hazard perception test (HPT) which measured participants' response times to hazards embedded in video recordings of real-world driving, a measure which has been shown to be linked to crash risk. RESULTS: Dmin for the random-dot stimulus ranged from -0.88 to -0.12 log minutes of arc, and the minimum drift rate for the Gabor stimulus ranged from 0.01 to 0.35 cycles per second. Both measures of motion sensitivity significantly predicted response times on the HPT. In addition, while the relationship involving the HPT and motion sensitivity for the random-dot kinematogram was partially explained by the other visual function measures, the relationship with sensitivity for detection of the drifting Gabor stimulus remained significant even after controlling for these variables. CONCLUSION: These findings suggest that motion perception plays an important role in the visual perception of driving-relevant hazards independent of other areas of visual function and should be further explored as a predictive test of driving safety. Future research should explore the causes of reduced motion perception in order to develop better interventions to improve road safety.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research projects and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, as the gathered information comes from the crowd, data quality is always hard to manage. There are many ways to manage data quality, and reputation management is one of the common approaches. In recent years, many research teams have deployed audio or image sensors in natural environments in order to monitor the status of animals or plants. The collected data are analysed by ecologists. However, as the amount of collected data is exceedingly large and the number of ecologists is very limited, it is impossible for scientists to manually analyse all these data. The functions of existing automated tools to process the data are still very limited and the results are still not very accurate. Therefore, researchers have turned to recruiting general citizens who are interested in helping scientific research to do the pre-processing tasks such as species tagging. Although research teams can save time and money by recruiting general citizens to volunteer their time and skills to help data analysis, the reliability of contributed data varies a lot. Therefore, this research aims to investigate techniques to enhance the reliability of data contributed by general citizens in scientific research projects, especially acoustic sensing projects. In particular, we aim to investigate how to use reputation management to enhance data reliability. Reputation systems have been used to resolve uncertainty and improve data quality in many marketing and E-Commerce domains. The commercial organizations which have chosen to embrace reputation management and implement the technology have gained many benefits.
Data quality issues are significant to the domain of Citizen Science due to the quantity and diversity of people and devices involved. However, research on reputation management in this area is relatively new. We therefore start our investigation by examining existing reputation systems in different domains. Then we design novel reputation management approaches for Citizen Science projects to categorise participants and data. We have investigated some critical elements which may influence data reliability in Citizen Science projects. These elements include personal information such as location and education and performance information such as the ability to recognise certain bird calls. The designed reputation framework is evaluated by a series of experiments involving many participants for collecting and interpreting data, in particular, environmental acoustic data. Our research in exploring the advantages of reputation management in Citizen Science (or crowdsourcing in general) will help increase awareness among organizations that are unacquainted with its potential benefits.
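One widely used building block for such a framework is a Beta reputation system. The sketch below (the parameter names, bird labels and weighting rule are invented for illustration, not the framework designed in the thesis) scores each participant by their record of verified tags and weights their species-tagging votes accordingly:

```python
def reputation(r, s, prior=1.0):
    # Expected value of Beta(r + prior, s + prior) over r verified
    # correct and s verified incorrect annotations; a newcomer with
    # no evidence starts at a neutral 0.5.
    return (r + prior) / (r + s + 2 * prior)

def weight_votes(tags, reputations):
    # Aggregate species tags for one audio clip, weighting each vote
    # by its contributor's reputation; return the winning label.
    scores = {}
    for user, label in tags:
        scores[label] = scores.get(label, 0.0) + reputations[user]
    return max(scores, key=scores.get)

reps = {"alice": reputation(40, 2),   # experienced, mostly correct: ~0.93
        "bob": reputation(1, 5),      # frequently wrong: 0.25
        "carol": reputation(0, 0)}    # newcomer: neutral 0.5

votes = [("alice", "Menura novaehollandiae"),
         ("bob", "Dacelo novaeguineae"),
         ("carol", "Dacelo novaeguineae")]
print(weight_votes(votes, reps))
# alice's single high-reputation vote outweighs two low-reputation votes
```

The design choice here is that reliability is earned from verified history rather than assumed, which is one way a reputation framework can categorise participants and their data.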
Abstract:
This study contributes to the understanding of the contribution of financial reserves to sustaining nonprofit organisations. Recognising the limited recent Australian research in the area of nonprofit financial vulnerability, it specifically examines financial reserves held by signatories to the Code of Conduct of the Australian Council for International Development (ACFID) for the years 2006 to 2010. As this period includes the Global Financial Crisis, it presents a unique opportunity to observe the role of savings in a period of heightened financial threats to sustainability. The need for nonprofit entities to maintain reserves, while appearing intuitively evident, is neither unanimously accepted nor supported by established theoretic constructs. Some early frameworks attempt to explain the savings behaviour of nonprofit organisations and its role in organisational sustainability. Where researchers have considered the issue, its treatment has usually been either purely descriptive or, alternatively, peripheral to a broader attempt to predict financial vulnerability. Given the importance of nonprofit entities to civil society, the sustainability of these organisations during times of economic contraction, such as the recent Global Financial Crisis, is a significant issue. Widespread failure of nonprofits, or even the perception of failure, would directly affect not only those individuals who access their public goods and services, but also public confidence in both government and the sector's ability to manage and achieve its purpose. This study attempts to shine a light on the paradox inherent in considering nonprofit savings. On the one hand, a prevailing public view is that nonprofit organisations should not hoard and indeed should spend all of their funds on the direct achievement of their purposes.
Against this is the commonsense need for a financial buffer, if only to allow for the day-to-day contingencies of pay rises and cost increases. At the entity level, the extent of reserves accumulated (or not) is an important consideration for Management Boards. The general public are also interested in knowing the level of funds held by nonprofits as a measure of both their commitment to purpose and as an indicator of their effectiveness. There is a need to communicate the level and prevalence of reserve holdings, balancing the prudent hedging of uncertainty against a sense of resource hoarding in the mind of donors. Finally, funders (especially governments) are interested in knowing the appropriate level of reserves to facilitate the ongoing sustainability of the sector. This is particularly so where organisations are involved in the provision of essential public goods and services. At a scholarly level, the study seeks to provide a rationale for this behaviour within the context of appropriate theory. At a practical level, the study seeks to give an indication of the drivers for savings, the actual levels of reserves held within the sector studied, as well as an indication as to whether the presence of reserves did mitigate the effects of financial turmoil during the Global Financial Crisis. The argument is not whether there is a need to ensure sustainability of nonprofits, but rather how it is to be done and whether the holding of reserves (net assets) is an essential element in achieving this. While the study offers no simple answers, it does appear that the organisations studied present as two groups: the 'savers', who build reserves and keep money in the bank, and the 'spender-deliverers', who put their resources on the ground. To progress an understanding of this dichotomy, the study suggests a need to move from its current approach to one which more closely explores accounts-based empirical evidence of donor attitudes and nonprofit Management Board strategy.
Abstract:
The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a decision involving a choice of the extent of their income to report to tax authorities, given a certain institutional environment, represented by parameters such as the probability of detection and penalties in the event the agent is caught. While this basic framework yields important insights on tax compliance behaviour, it has some critical limitations. Specifically, it indicates a level of compliance that is significantly below what is observed in the data. This thesis revisits the original framework with a view towards addressing this issue, and examining the political economy implications of tax evasion for progressivity in the tax structure. The approach followed involves building a macroeconomic, dynamic equilibrium model for the purpose of examining these issues, using a step-wise model building procedure starting with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations involves incorporating the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation of this simple construct involves allowing agents to initially decide whether to evade taxes or not. In the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report.
We find that the evade or not assumption has strikingly different and more realistic implications for the extent of evasion, and demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature, and involve consumption smoothing across time and across various states of nature. Specifically, since deciding to undertake tax evasion impacts on the consumption smoothing ability of the agent by creating two states of nature in which the agent is caught or not caught, there is a possibility that their utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an evade or not choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically choose to vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the evade or not model vote for a relatively low value of the tax rate. The final steps in the model building procedure involve grafting the two-period models with a political economy choice into a dynamic overlapping generations setting with more general, non-linear tax schedules and a cost-of-evasion function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the models' ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the evade or not version of the model and its Allingham and Sandmo counterpart are now very striking; there is now a large range of values of the inequality parameter for which agents in the evade or not model vote for a low degree of progressivity. This is because, in the evade or not version of the model, low values of the tax rate encourage a large number of agents to choose the not-evade option, so that the redistributive mechanism is more efficient relative to the situations in which tax rates are high. Some further implications of the models of this thesis relate to whether variations in the level of inequality, and parameters such as the probability of detection and penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate for a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
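The core evade-or-not comparison can be illustrated numerically. The sketch below is a stylised toy with log utility and invented parameter values, not the thesis's calibrated model: the agent compares the certain utility of honest reporting against the maximised expected utility over declared income d.

```python
import math

# Illustrative parameters: income y, tax rate t, detection probability p,
# fine rate f applied to evaded tax when caught.
y, t, p, f = 100.0, 0.3, 0.05, 1.5

def eu_evade(d):
    # Declare d of income y. If caught, pay tax on d plus the fine
    # f * t * (y - d) on the evaded tax; otherwise keep the gain.
    u_caught = math.log(y - t * d - f * t * (y - d))
    u_free = math.log(y - t * d)
    return p * u_caught + (1 - p) * u_free

u_honest = math.log(y - t * y)              # certain utility if not evading
d_star = max(range(101), key=eu_evade)      # best declaration if evading
print(d_star, eu_evade(d_star) > u_honest)
```

With the expected penalty rate p * f far below t, the classical interior-choice margin pushes the declaration to zero (full evasion). The evade-or-not variant adds a prior comparison of that risky optimum against the certainty of honesty, which is the channel through which the thesis obtains more realistic compliance levels.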
Abstract:
Search technologies are critical to enable clinical staff to rapidly and effectively access patient information contained in free-text medical records. Medical search is challenging as terms in the query are often general but those in relevant documents are very specific, leading to granularity mismatch. In this paper we propose to tackle granularity mismatch by exploiting subsumption relationships defined in formal medical domain knowledge resources. In symbolic reasoning, a subsumption (or 'is-a') relationship is a parent-child relationship where one concept is a subset of another concept. Subsumed concepts are included in the retrieval function. In addition, we investigate a number of initial methods for combining weights of query concepts and those of subsumed concepts. Subsumption relationships were found to provide strong indication of relevant information; their inclusion in retrieval functions yields performance improvements. This result motivates the development of formal models of relationships between medical concepts for retrieval purposes.
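One simple way to realise the described expansion can be sketched as follows. The concept hierarchy and the discount rule below are invented for illustration (the paper investigates several combination methods): query concepts are expanded with their subsumed 'is-a' descendants, whose weights are discounted before being merged into the retrieval function.

```python
# child -> parent: a tiny invented fragment of a medical hierarchy.
IS_A = {
    "viral pneumonia": "pneumonia",
    "bacterial pneumonia": "pneumonia",
    "pneumonia": "lung disease",
}

def subsumed(concept):
    # All concepts whose 'is-a' chain leads up to `concept`.
    kids = [c for c, p in IS_A.items() if p == concept]
    return kids + [g for k in kids for g in subsumed(k)]

def expand(query_weights, discount=0.5):
    # Merge discounted subsumed-concept weights into the query; the
    # max() keeps the strongest evidence when a concept is reached
    # from several query terms.
    expanded = dict(query_weights)
    for concept, w in query_weights.items():
        for child in subsumed(concept):
            expanded[child] = max(expanded.get(child, 0.0), w * discount)
    return expanded

print(expand({"lung disease": 1.0}))
# → {'lung disease': 1.0, 'pneumonia': 0.5,
#    'viral pneumonia': 0.5, 'bacterial pneumonia': 0.5}
```

A general query term like "lung disease" thus also matches documents that only mention the specific subsumed concepts, which is exactly the granularity mismatch the paper targets.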
Abstract:
This paper outlines a novel approach for modelling semantic relationships within medical documents. Medical terminologies contain a rich source of semantic information critical to a number of techniques in medical informatics, including medical information retrieval. Recent research suggests that corpus-driven approaches are effective at automatically capturing semantic similarities between medical concepts, thus making them an attractive option for accessing semantic information. Most previous corpus-driven methods only considered syntagmatic associations. In this paper, we adapt a recent approach that explicitly models both syntagmatic and paradigmatic associations. We show that the implicit similarity between certain medical concepts can only be modelled using paradigmatic associations. In addition, the inclusion of both types of associations overcomes the sensitivity to the training corpus experienced by previous approaches, making our method both more effective and more robust. This finding may have implications for researchers in the area of medical information retrieval.
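The distinction between the two association types can be illustrated on a toy corpus (the documents and terms below are invented): syntagmatic association is direct co-occurrence, while paradigmatic association compares co-occurrence profiles, so it can link terms that never co-occur but appear in the same contexts.

```python
import math
from collections import Counter
from itertools import combinations

docs = [
    ["ibuprofen", "relieves", "pain"],
    ["aspirin", "relieves", "pain"],
    ["ibuprofen", "reduces", "fever"],
    ["aspirin", "reduces", "fever"],
]

# Count unordered within-document co-occurrences.
cooc = Counter()
for doc in docs:
    for a, b in combinations(set(doc), 2):
        cooc[frozenset((a, b))] += 1

def syntagmatic(a, b):
    # First-order association: how often the terms occur together.
    return cooc[frozenset((a, b))]

def paradigmatic(a, b, vocab):
    # Second-order association: cosine similarity of the two terms'
    # co-occurrence vectors over the rest of the vocabulary.
    dims = [w for w in vocab if w not in (a, b)]
    va = [syntagmatic(a, w) for w in dims]
    vb = [syntagmatic(b, w) for w in dims]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return dot / (na * nb) if na and nb else 0.0

vocab = sorted({w for d in docs for w in d})
print(syntagmatic("ibuprofen", "aspirin"))          # 0: never co-occur
print(paradigmatic("ibuprofen", "aspirin", vocab))  # 1.0: identical contexts
```

The two drug terms have zero syntagmatic association yet maximal paradigmatic similarity, the kind of implicit relatedness the paper argues only paradigmatic modelling can capture.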