Abstract:
The selection criteria for contractor pre-qualification are characterized by the co-existence of quantitative and qualitative data. The qualitative data is non-linear, uncertain and imprecise. An ideal decision support system for contractor pre-qualification should be able to handle both kinds of data and to map the complicated nonlinear relationships among the selection criteria, such that rational and consistent decisions can be made. In this research paper, an artificial neural network model was developed to assist public clients in identifying suitable contractors for tendering. The pre-qualification criteria (variables) were identified for the model. One hundred and twelve real pre-qualification cases were collected from civil engineering projects in Hong Kong, and eighty-eight hypothetical pre-qualification cases were also generated according to the “If-then” rules used by professionals in the pre-qualification process. Each pre-qualification case consisted of input ratings for candidate contractors’ attributes and their corresponding pre-qualification decisions. The training of the neural network model was accomplished using a purpose-developed program, in which a conjugate gradient descent algorithm was incorporated to improve the learning performance of the network. Cross-validation was applied to estimate the generalization errors based on the “re-sampling” of training pairs. The results of the analysis fully comply with the current practice of public developers in Hong Kong, and the case studies show that the artificial neural network model is suitable for mapping the complicated nonlinear relationship between contractors’ attributes and the corresponding pre-qualification (or disqualification) decisions. It can be concluded that the artificial neural network model is an ideal alternative for performing the contractor pre-qualification task.
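A minimal sketch of the flavour of such a model, assuming synthetic attribute ratings, a stand-in qualification rule and SciPy's nonlinear conjugate gradient optimiser; the paper's actual criteria, cases and training program are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical data: 200 cases, 8 attribute ratings in [0, 1] (e.g. financial
# standing, track record, resources); label 1 = pre-qualify. The labelling
# rule below is a stand-in for the "If-then" rules used by professionals.
X = rng.random((200, 8))
y = (X.mean(axis=1) > 0.5).astype(float)

n_in, n_hid = X.shape[1], 6  # one hidden layer of 6 tanh units

def unpack(w):
    i = n_in * n_hid
    W1 = w[:i].reshape(n_in, n_hid)
    b1 = w[i:i + n_hid]
    W2 = w[i + n_hid:i + 2 * n_hid]
    b2 = w[-1]
    return W1, b1, W2, b2

def loss(w):
    """Cross-entropy of the network's qualify/disqualify prediction."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

w0 = rng.normal(scale=0.1, size=n_in * n_hid + 2 * n_hid + 1)
res = minimize(loss, w0, method="CG")  # nonlinear conjugate gradient
print("training loss:", round(res.fun, 4))
```

Cross-validation as described in the abstract would wrap this fit in a loop over re-sampled train/validation splits of the cases.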
Abstract:
Network-based Intrusion Detection Systems (NIDSs) analyse network traffic to detect instances of malicious activity. Typically, this is only possible when the network traffic is accessible for analysis. With the growing use of Virtual Private Networks (VPNs) that encrypt network traffic, the NIDS can no longer access this crucial audit data. In this paper, we present an implementation and evaluation of our approach proposed in Goh et al. (2009). It is based on Shamir's secret-sharing scheme and allows a NIDS to function normally in a VPN without any modifications and without compromising the confidentiality afforded by the VPN.
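For reference, a minimal textbook sketch of Shamir's (k, n) secret sharing over a prime field; the parameters are illustrative and do not reflect the Goh et al. (2009) implementation.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime comfortably larger than the toy secret

def split(secret, k, n):
    """Create n shares; any k of them reconstruct the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):  # Horner evaluation of the polynomial at x
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(123456789, k=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares -> 123456789
```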
Abstract:
Homelessness is a complex problem that manifests in all societies. This intractable and ‘wicked’ issue resists single-agency solutions, and its resolution requires a large, on-going investment of financial and professional resources that few organisations can sustain. This paper adopts a social innovation framework to examine government and community sector responses to homelessness. While recent evaluations and policy prescriptions have suggested better integrated and more co-ordinated service delivery models for addressing homelessness, there is little understanding of the innovation framework within which alternative service system paradigms emerge. A framework that distils and explains different levels of innovation is put forward. The framework highlights that while government may lead strategic-level innovations, community organisations are active in developing innovation at the service and client level. Moreover, community organisations may be unaware of the innovative capacity that resides in their creative responses to the social crisis and marginalisation of being without shelter.
Abstract:
The two longitudinal case studies that make up this dissertation sought to explain and predict the relationship between usability and clinician acceptance of a health information system. The overall aim of the research was to determine what role usability plays in the acceptance or rejection of systems used by clinicians in a healthcare context. The focus was on the end users (the clinicians) rather than on the views of the system designers, the managers responsible for implementation, or the clients of the clinicians. A mixed methods approach was adopted that drew on both qualitative and quantitative research methods. The study followed the implementation of a community health information system from its early beginnings to established practice. Users were drawn from different health service departments with distinctly different organisational cultures and attitudes to the information and communication technology used in this context. The study provided evidence that a usability analysis in this context would not necessarily be valid when users have prior reservations about acceptance. Initial training and post-implementation support were investigated, together with the nature of the clinicians themselves, to determine factors that may influence their attitudes. This research identified that acceptance of a system is not necessarily a measure of its quality, capability and usability; rather, it is influenced by the user’s attitude, which is shaped by outside factors and by the nature and quality of training. The need to recognise the limitations of current methodologies for analysing usability and acceptance was explored to lay the foundations for further research.
Abstract:
The paper examines the decision by Australian Real Estate Investment Trusts (A-REITs) to issue seasoned equity offerings from 2000 to 2008, and the stock market reaction to those offerings. The findings reveal that highly leveraged A-REITs with variable earnings are less likely to issue seasoned equity offerings. Inconsistent results for the structure and type of properties held by the A-REITs do not allow inferences to be drawn. Similar to previous studies of seasoned equity offerings, we find a significant negative abnormal return associated with their announcement and no evidence of excessive leakage of information. Furthermore, differences in market reaction to SEO announcements between the pre-global financial crisis (GFC) era (2000-2006) and the GFC era (2007-2008) are noted, with GFC-era shareholders incurring larger abnormal return losses of 1.13% on the SEO announcement day, compared with a pre-GFC era shareholder loss of 0.34%. Cross-sectional regressions show that the issued amount, leverage and profitability are significant factors affecting abnormal returns. Growth opportunities, tangibility, operating risk, size of the A-REIT and other variables capturing A-REIT structure and property types held do not have an impact on abnormal returns.
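A minimal sketch of the market-model event-study arithmetic behind such announcement-day figures, using simulated returns rather than the paper's A-REIT data; the -1.13% announcement-day shock is injected purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated daily returns: market index and one A-REIT over a 200-day
# estimation window plus an event window (index 200 = announcement day).
r_mkt = rng.normal(0.0003, 0.01, 221)
r_reit = 0.0001 + 0.8 * r_mkt + rng.normal(0, 0.008, 221)
r_reit[200] -= 0.0113  # stylised -1.13% announcement-day drop (GFC era)

est_mkt, est_reit = r_mkt[:200], r_reit[:200]

# Market model R_it = alpha + beta * R_mt + e_it, fitted by OLS
beta, alpha = np.polyfit(est_mkt, est_reit, 1)

# Abnormal returns over the event window (days -5 to +20 around the event)
event = slice(195, 221)
ar = r_reit[event] - (alpha + beta * r_mkt[event])
car = ar.cumsum()
print(f"announcement-day AR: {ar[5]:.4f}, CAR(+20): {car[-1]:.4f}")
```

The cross-sectional stage would then regress each A-REIT's announcement-day abnormal return on the issued amount, leverage, profitability and the other candidate factors.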
Abstract:
This article surveys literature bearing on the issue of parental liability and responsibility for the crimes of young offenders, with a particular focus on comparing different approaches to dealing with the issue in Australia and Canada. This comparative analysis of Australian and Canadian legislative and policy approaches is situated within a broader discussion of arguments about the “punitive turn” in youth justice, responsibilisation, and cross-jurisdictional criminal justice policy transfer and convergence. Our findings suggest that there are significant differences in the manner and extent to which Australia and Canada have invoked parental responsibility laws and policies as part of the solution to dealing with youth crime. We conclude by speculating on some of the reasons for these differences and establishing an agenda for further needed cross-jurisdictional research. In particular, we argue that it would be fruitful to undertake a cross-jurisdictional study that examines the development and effects of parental responsibility laws across a larger number of Western countries, as well as across individual states and provinces within these national jurisdictions.
Abstract:
We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invasive invertebrates whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
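To make the power-of-detection calculation concrete: if each device in a stratum independently detects an incursion with probability p per period, then n devices achieve power 1 - (1 - p)^n, so the smallest adequate n is ceil(log(1 - power) / log(1 - p)). The sketch below applies this stratum by stratum; the strata, detection probabilities and costs are illustrative assumptions, not the paper's model.

```python
import math

# Per-device detection probability, required power of detection and unit
# cost for each (hypothetical) risk stratum of the sampling frame.
strata = {
    "port":  {"p_detect": 0.05, "power": 0.95, "cost_per_device": 40.0},
    "urban": {"p_detect": 0.02, "power": 0.90, "cost_per_device": 25.0},
    "rural": {"p_detect": 0.01, "power": 0.80, "cost_per_device": 15.0},
}

budget = 8000.0
total = 0.0
for name, s in strata.items():
    # n devices detect with probability 1 - (1 - p)^n; solve for minimum n.
    n = math.ceil(math.log(1 - s["power"]) / math.log(1 - s["p_detect"]))
    cost = n * s["cost_per_device"]
    total += cost
    print(f"{name}: {n} devices, cost {cost:.0f}")

print(f"total cost {total:.0f} "
      f"({'within' if total <= budget else 'over'} budget of {budget:.0f})")
```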
Abstract:
The World Wide Web has become a medium for people to share information. People use Web-based collaborative tools such as question answering (QA) portals, blogs/forums, email and instant messaging to acquire information and to form online communities. In an online QA portal, a user asks a question and other users can provide answers based on their knowledge, with the question usually being answered by many users. It can become overwhelming and time- and resource-consuming for a user to read all of the answers provided for a given question. Thus, there exists a need for a mechanism to rank the provided answers so users can focus on reading only good-quality answers. The majority of online QA systems use user feedback to rank users’ answers, and the user who asked the question can decide on the best answer. Other users who did not participate in answering the question can also vote to determine the best answer. However, ranking the best answer via this collaborative method is time consuming and requires the ongoing involvement of users to provide the needed feedback. The objective of this research is to discover a way to automatically recommend the best answer as part of a ranked list of answers for a posted question, without the need for user feedback. The proposed approach combines both a non-content-based reputation method and a content-based method to solve the problem of recommending the best answer to the user who posted the question. The non-content-based method assigns a score to each user which reflects the user’s reputation level in using the QA portal system. Each user is assigned two types of non-content-based reputation scores: a local reputation score and a global reputation score. The local reputation score plays an important role in deciding the reputation level of a user for the category in which the question is asked. The global reputation score indicates the prestige of a user across all of the categories in the QA system. Due to the possibility of user cheating, such as awarding the best answer to a friend regardless of the answer quality, a content-based method for determining the quality of a given answer is proposed alongside the non-content-based reputation method. Answers for a question from different users are compared with an ideal (or expert) answer using traditional Information Retrieval and Natural Language Processing techniques. Each answer provided for a question is assigned a content score according to how well it matches the ideal answer. To evaluate the performance of the proposed methods, each recommended best answer is compared with the best answer determined by one of the most popular link analysis methods, Hyperlink-Induced Topic Search (HITS). The proposed methods are able to yield high accuracy, as shown by Kendall and Spearman correlation scores. The reputation method outperforms the HITS method in terms of recommending the best answer. The inclusion of the reputation score with the content score improves the overall performance, which is measured through the use of Top-n match scores.
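A minimal sketch of the combined scoring idea, assuming a toy question, hypothetical reputation values and TF-IDF cosine similarity to an ideal answer as the content score; the thesis's actual reputation formulas and weighting are not reproduced here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

ideal = "Restart the router, then renew the DHCP lease to restore connectivity."
answers = {
    "alice": "Try restarting your router and renewing the DHCP lease.",
    "bob":   "Buy a new computer.",
    "carol": "Renew the DHCP lease after a router restart; that restores it.",
}
reputation = {"alice": 0.9, "bob": 0.2, "carol": 0.7}  # local-category scores

vec = TfidfVectorizer().fit([ideal] + list(answers.values()))
ideal_v = vec.transform([ideal])

def score(user, w=0.5):
    """Blend the content score (cosine to the ideal answer) with reputation."""
    content = cosine_similarity(vec.transform([answers[user]]), ideal_v)[0, 0]
    return w * content + (1 - w) * reputation[user]

ranked = sorted(answers, key=score, reverse=True)
print("recommended best answer by:", ranked[0])
```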
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices on the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series for the five states of Australia are also found to possess long memory; for these series, heavy tails are also pronounced in the probability densities. The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices on the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. Equations of this type are used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming the possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
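To illustrate the detection machinery of Part I, here is a minimal sketch of standard DFA, the q = 2 special case of MF-DFA with linear (first-order) detrending, applied to synthetic white noise; the slope of log F(s) against log s estimates the Hurst exponent, with H > 0.5 suggestive of long memory. The series, scale grid and detrending order are illustrative choices, not those of the thesis.

```python
import numpy as np

def dfa(x, scales):
    """Second-order detrended fluctuation function F(s) with DFA-1 detrending."""
    y = np.cumsum(x - x.mean())  # the "profile" (integrated series)
    F = []
    for s in scales:
        n_seg = len(y) // s
        resid = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)  # remove a linear trend per segment
            resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(resid)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.normal(size=4096)  # white noise: expect H close to 0.5
scales = np.unique(np.logspace(2, 9, 15, base=2).astype(int))
F = dfa(x, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated Hurst exponent: {H:.2f}")
```

The full MF-DFA generalises this by raising the segment variances to the power q/2 before averaging, yielding a family of exponents h(q) rather than a single H.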
Abstract:
Credentials are a salient form of cultural capital, and if a student’s learning and productions are not assessed, they are invisible in current social systems of education and employment. In this field, invisible equals non-existent. This paper arises from the context of an alternative education institution where conventional educational assessment techniques currently fail to recognise the creativity and skills of a cohort of marginalised young people. In order to facilitate a new assessment model, an electronic portfolio system (EPS) is being developed and trialled to capture evidence of students’ learning and their productions. In so doing, a dynamic system of arranging, exhibiting, exploiting and disseminating assessment data in the form of coherent, meaningful and valuable reports will be maintained. The paper investigates the notion of assessing the development of creative thinking and skills by means of a computerised system that operates in an area described as the efield. A model of the efield is delineated and explained as a zone existing within the internet where free users exploit the cloud and cultivate social and cultural capital. Drawing largely on sociocultural theory and Bourdieu’s concepts of field, habitus and capitals, the article positions the efield as a potentially productive instrument in assessment-for-learning practices. An important aspect of the dynamics of this instrument is the recognition of teachers as learners. This is seen as an integral factor in the sociocultural approach to assessment-for-learning practices that will be deployed with the EPS. What actually takes place is argued to be assessment for learning as a field of exchange. The model produced in this research is aimed at delivering visibility and recognition through an engaging instrument that will enhance the prospects of marginalised young people and shift the paradigm for assessment in a creative world.
Abstract:
A wide range of screening strategies have been employed to isolate antibodies and other proteins with specific attributes, including binding affinity, specificity, stability and improved expression. However, there remains no high-throughput system to screen for target-binding proteins in a mammalian, intracellular environment. Such a system would allow binding reagents to be isolated against intracellular clinical targets such as cell signalling proteins associated with tumour formation (p53, ras, cyclin E), proteins associated with neurodegenerative disorders (huntingtin, beta-amyloid precursor protein), and various proteins crucial to viral replication (e.g. HIV-1 proteins such as Tat, Rev and Vif-1), which are difficult to screen by phage, ribosome or cell-surface display. This study used the β-lactamase protein complementation assay (PCA) as the display and selection component of a system for screening a protein library in the cytoplasm of HEK 293T cells. The colicin E7 (ColE7) and Immunity protein 7 (Imm7) Escherichia coli proteins were used as model interaction partners for developing the system. These proteins drove effective β-lactamase complementation, resulting in a signal-to-noise ratio (9:1 to 13:1) comparable to that of other β-lactamase PCAs described in the literature. The model Imm7-ColE7 interaction was then used to validate protocols for library screening. Single positive cells that harboured the Imm7 and ColE7 binding partners were identified and isolated using flow cytometric cell sorting in combination with the fluorescent β-lactamase substrate CCF2/AM. A single-cell PCR was then used to amplify the Imm7 coding sequence directly from each sorted cell. With the screening system validated, it was used to screen a protein library based on the Imm7 scaffold against a proof-of-principle target. The wild-type Imm7 sequence, as well as mutants with wild-type residues in the ColE7-binding loop, were enriched from the library after a single round of selection, which is consistent with other eukaryotic screening systems such as yeast and mammalian cell-surface display. In summary, this thesis describes a new technology for screening protein libraries in a mammalian, intracellular environment. This system has the potential to complement existing screening technologies by allowing access to intracellular proteins and expanding the range of targets available to the pharmaceutical industry.
Abstract:
Patients with severe back deformities can greatly benefit from customised medical seating, which is made by taking measurements of each individual patient and fabricating the seat to those measurements. The current measuring systems employed by the industry are limited to use in clinics, which are generally located only in major population centres. Patients living in remote areas are severely affected by this, as the clinics can be far away and inaccessible. Providing customised medical seating to these patients requires a new measurement system that is portable, so that it can be transported to patients in remote areas. The requirements for a new measurement system were analysed to suit the needs of Equipment Technology Services (ETS) of the Cerebral Palsy League of Queensland. A design for the new measurement system was conceptualised by reviewing systems and technologies across various scientific disciplines, and finalised by optimising each individual component. The final approach was validated by measuring difficult models and repeating the process to check for process variances. The system has now been adopted for clinical evaluation by ETS, and suggestions have been made for further improvements to this new measurement approach.