65 results for Topologies on an arbitrary set
Abstract:
Objective: To develop a system for the automatic classification of pathology reports for Cancer Registry notifications. Method: A two-pass approach is proposed to classify whether pathology reports are cancer notifiable or not. The first pass queries pathology HL7 messages for known report types that are received by the Queensland Cancer Registry (QCR), while the second pass analyses the free-text reports and identifies those that are cancer notifiable. Cancer Registry business rules, natural language processing and symbolic reasoning using the SNOMED CT ontology were adopted in the system. Results: The system was developed on a corpus of 500 histology and cytology reports (with 47% notifiable reports) and evaluated on an independent set of 479 reports (with 52% notifiable reports). Results show that the system can reliably classify cancer notifiable reports with a sensitivity, specificity, and positive predictive value (PPV) of 0.99, 0.95, and 0.95, respectively, for the development set, and 0.98, 0.96, and 0.96 for the evaluation set. High sensitivity can be achieved at a slight expense in specificity and PPV. Conclusion: The system demonstrates how medical free-text processing enables the classification of cancer notifiable pathology reports with high reliability for potential use by Cancer Registries and pathology laboratories.
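For illustration only, the two-pass idea described above can be pictured as a structured-field filter followed by a free-text check. The sketch below is not the QCR system; the report types, keyword list and negation handling are hypothetical stand-ins for the registry business rules and SNOMED CT reasoning.

# Illustrative two-pass notifiability filter (hypothetical rules, not the QCR system).

NOTIFIABLE_REPORT_TYPES = {"HISTOLOGY", "CYTOLOGY"}          # pass 1: known HL7 report types
NOTIFIABLE_CONCEPTS = {"carcinoma", "melanoma", "lymphoma"}  # pass 2: stand-in for SNOMED CT reasoning
NEGATIONS = ("no evidence of", "negative for")

def first_pass(report_type: str) -> bool:
    """Query the structured HL7 field for report types the registry receives."""
    return report_type.upper() in NOTIFIABLE_REPORT_TYPES

def second_pass(free_text: str) -> bool:
    """Crude free-text check: a notifiable concept that is not locally negated."""
    text = free_text.lower()
    for concept in NOTIFIABLE_CONCEPTS:
        idx = text.find(concept)
        if idx != -1 and not any(neg in text[max(0, idx - 40):idx] for neg in NEGATIONS):
            return True
    return False

def is_notifiable(report_type: str, free_text: str) -> bool:
    return first_pass(report_type) and second_pass(free_text)

print(is_notifiable("Histology", "Sections show invasive ductal carcinoma."))   # True
print(is_notifiable("Histology", "There is no evidence of carcinoma."))         # False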
Abstract:
Abstract: Australia’s ecosystems are the basis of our current and future prosperity, and our national well-being. A strong and sustainable Australian ecosystem science enterprise is vital for understanding and securing these ecosystems in the face of current and future challenges. This Plan defines the vision and key directions for a national ecosystem science capability that will enable Australia to understand and effectively manage its ecosystems for decades to come. The Plan’s underlying theme is that excellent science supports a range of activities, including public engagement, that enable us to understand and maintain healthy ecosystems. Those healthy ecosystems are the cornerstone of our social and economic well-being. The vision guiding the development of this Plan is that in 20 years’ time the status of Australian ecosystems and how they change will be widely reported and understood, and the prosperity and well-being they provide will be secure. To enable this, Australia’s national ecosystem science capability will be coordinated, collaborative and connected. The Plan is based on an extensive set of collaboratively generated proposals from national town hall meetings that also form the basis for its implementation. Some directions within the Plan are for the Australian ecosystem science community itself to implement; others will involve the users of ecosystem science and the groups that fund ecosystem science. We identify six equal-priority areas for action to achieve our vision: (i) delivering maximum impact for Australia: enhancing relationships between scientists and end-users; (ii) supporting long-term research; (iii) enabling ecosystem surveillance; (iv) making the most of data resources; (v) inspiring a generation: empowering the public with knowledge and opportunities; (vi) facilitating coordination, collaboration and leadership. This shared vision will enable us to consolidate our current successes, overcome remaining barriers and establish the foundations to ensure Australian ecosystem science delivers for the future needs of Australia.
Abstract:
This paper presents an uncertainty quantification study of the performance analysis of the high pressure ratio single-stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One of the advantages of this approach is that it does not require any modification of the CFD code for the propagation of random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing edge tip radius to the total-to-static efficiency of the turbine, compared to the angular velocity and trailing edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows the investigation of the sensitivity of the turbine efficiency to the blade thickness profiles. The gPC approach is also applied to coupled random parameters. The results show that the most influential pair of coupled random variables is the trailing edge tip radius coupled with the angular velocity.
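For readers unfamiliar with the non-intrusive pseudo-spectral projection mentioned above, a standard generic formulation (notation is not taken from the paper) expands a quantity of interest u, such as the total-to-static efficiency, in orthogonal polynomials of the random inputs \xi:

u(\xi) \approx \sum_{k=0}^{P} \hat{u}_k \, \Phi_k(\xi), \qquad \hat{u}_k = \frac{\langle u, \Phi_k \rangle}{\langle \Phi_k, \Phi_k \rangle} \approx \frac{1}{\langle \Phi_k, \Phi_k \rangle} \sum_{q} w_q \, u(\xi_q) \, \Phi_k(\xi_q).

Because the coefficients are recovered from deterministic solver runs at the quadrature nodes \xi_q, the CFD code itself needs no modification, which is why the approach is described as non-intrusive.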
Abstract:
To date, most applications of algebraic analysis and attacks on stream ciphers are on those based on linear feedback shift registers (LFSRs). In this paper, we extend algebraic analysis to non-LFSR based stream ciphers. Specifically, we perform an algebraic analysis on the RC4 family of stream ciphers, an example of stream ciphers based on dynamic tables, and investigate its implications for potential algebraic attacks on the cipher. This is, to our knowledge, the first paper that evaluates the security of RC4 against algebraic attacks by providing a full set of equations that describe the complex word manipulations in the system. For an arbitrary word size, we derive algebraic representations for the three main operations used in RC4, namely state extraction, word addition and state permutation. Equations relating the internal states and keystream of RC4 are then obtained from each component of the cipher based on these algebraic representations, and analysed in terms of their contributions to the security of RC4 against algebraic attacks. Interestingly, it is shown that each of the three main operations contained in the components has its own unique algebraic properties, and when their respective equations are combined, the resulting system becomes infeasible to solve. This results in a high level of security being achieved by RC4 against algebraic attacks. On the other hand, the removal of an operation from the cipher could compromise this security. Experiments on reduced versions of RC4 have been performed, which confirm the validity of our algebraic analysis and the conclusion that the full RC4 stream cipher seems to be immune to algebraic attacks at present.
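For reference, the three operations named above appear in the well-known RC4 keystream generator; the sketch below writes them out for an arbitrary word size n (the standard cipher uses n = 8, so the table holds 256 words), purely to make the terminology concrete.

# Minimal RC4 sketch for an arbitrary word size n.

def rc4_keystream(key, n=8, nwords=16):
    N = 1 << n                      # table size 2^n
    S = list(range(N))              # internal state table

    # Key scheduling: state permutation driven by the key
    j = 0
    for i in range(N):
        j = (j + S[i] + key[i % len(key)]) % N     # word addition mod 2^n
        S[i], S[j] = S[j], S[i]                    # state permutation (swap)

    # Keystream generation
    i = j = 0
    out = []
    for _ in range(nwords):
        i = (i + 1) % N
        j = (j + S[i]) % N                         # word addition
        S[i], S[j] = S[j], S[i]                    # state permutation
        out.append(S[(S[i] + S[j]) % N])           # state extraction
    return out

print(rc4_keystream([1, 2, 3, 4], n=8))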
Abstract:
An efficient numerical method to compute nonlinear solutions for two-dimensional steady free-surface flow over an arbitrary channel bottom topography is presented. The approach is based on a boundary integral equation technique similar to that of Vanden-Broeck (1996, J. Fluid Mech., 330, 339-347). The typical approach for this problem is to prescribe the shape of the channel bottom topography, with the free surface being provided as part of the solution. Here we take an inverse approach and prescribe the shape of the free surface a priori while solving for the corresponding bottom topography. We show how this inverse approach is particularly useful when studying topographies that give rise to wave-free solutions, allowing us to easily classify eleven basic flow types. Finally, the inverse approach is also adapted to calculate a distribution of pressure on the free surface, given the free-surface shape itself.
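As background (generic notation, not taken from the paper), the condition that closes such boundary integral formulations on the free surface is Bernoulli's equation; in one common nondimensionalisation, with velocities and lengths scaled by the upstream speed and depth, it reads

\frac{1}{2} F^2 \left(u^2 + v^2\right) + y = \frac{1}{2} F^2 + 1 \quad \text{on the free surface},

where F is the upstream Froude number. In the inverse approach the free-surface elevation y is prescribed, and the integral equation is solved for the flow field and the bottom shape consistent with it.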
Abstract:
An alternative approach to port decoupling and matching of arrays with tightly coupled elements is proposed. The method is based on the inherent decoupling effect obtained by feeding the orthogonal eigenmodes of the array. For this purpose, a modal feed network is connected to the array. The decoupled external ports of the feed network may then be matched independently by using conventional matching circuits. Such a system may be used in digital beam forming applications with good signal-to-noise performance. The theory is applicable to arrays with an arbitrary number of elements, but implementation is only practical for smaller arrays. The principle is illustrated by means of two examples.
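A minimal numerical illustration of the eigenmode idea (not taken from the paper, and using a made-up two-element impedance matrix): for a reciprocal array the mutual impedance matrix is symmetric, so exciting the elements with its eigenvectors yields decoupled modal ports that can then be matched independently.

# Eigenmode decoupling sketch with a hypothetical 2-element coupled array.

import numpy as np

Z = np.array([[50.0, 20.0],          # self and mutual impedances (ohms);
              [20.0, 50.0]])         # off-diagonal terms represent coupling

eigvals, Q = np.linalg.eigh(Z)       # columns of Q are the eigenmode excitations
Z_modal = Q.T @ Z @ Q                # modal impedance matrix

print(np.round(Z_modal, 6))          # diagonal -> modal ports are decoupled
print(eigvals)                       # each modal port sees its eigenvalue as impedance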
Abstract:
The LiteSteel Beam (LSB) is a new hollow flange section developed by OneSteel Australian Tube Mills using their patented dual electric resistance welding and automated continuous roll-forming technologies. It has a unique geometry consisting of torsionally rigid rectangular hollow flanges and a relatively slender web. It has found increasing popularity as a flexural member in residential, industrial and commercial buildings. The LSB is considerably lighter than traditional hot-rolled steel beams and provides both structural and construction efficiencies. However, LSB flexural members are subject to a relatively new lateral distortional buckling mode, which reduces their member moment capacities. Unlike the commonly observed lateral torsional buckling of steel beams, the lateral distortional buckling of LSBs is characterised by simultaneous lateral deflection, twist and cross-sectional change due to web distortion. The current design rules in AS/NZS 4600 (SA, 2005) for flexural members subject to lateral distortional buckling were found to be conservative by about 8% in the inelastic buckling region. Therefore, a new design rule was developed for LSBs subject to lateral distortional buckling based on finite element analyses of LSBs. The effect of section geometry was then considered and several geometrical parameters were used to develop an advanced set of design rules. This paper presents the details of the finite element analyses and the design curve development for hollow flange sections subject to lateral distortional buckling.
Abstract:
Purpose – The purpose of this paper is to examine buyer awareness and acceptance of environmental and energy efficiency measures in the New Zealand residential property markets. This study aims to provide a greater understanding of consumer behaviour in the residential property market in relation to green housing issues. Design/methodology/approach – The paper is based on an extensive survey of Christchurch real estate offices and was designed to gather data on the factors that were considered important by buyers in the residential property market. The survey was designed to allow these factors to be analysed on a socio-economic basis and to compare buyer behaviour based on property values. Findings – The results show that regardless of income levels, buyers still consider that the most important factors in the house purchase decision are the location of the property and its price. Although awareness of green housing issues and energy efficiency in housing is growing in the residential property market, it is only a major consideration for young and older buyers in the high income brackets and is only of some importance for all other buyer sectors of the residential property market. Many of the voluntary measures introduced by governments to improve the energy efficiency of residential housing are still not considered important by buyers, indicating that a more mandatory approach may have to be undertaken to improve energy efficiency in the established housing market, as these measures are not valued by the buyer. Originality/value – The paper confirms the variations in real estate buyer behaviour across the full range of residential property markets and the acceptance and awareness of green housing issues and measures. These results would be applicable to most established and transparent residential property markets.
Abstract:
We present a novel method and instrument for in vivo imaging and measurement of human corneal dynamics during an air puff. The instrument is based on high-speed swept source optical coherence tomography (ssOCT) combined with a custom-adapted air puff chamber from a non-contact tonometer, which uses an air stream to deform the cornea in a non-invasive manner. During the short period of time in which the deformation takes place, the ssOCT acquires multiple A-scans in time (an M-scan) at the center of the air puff, allowing observation of the dynamics of the anterior and posterior corneal surfaces as well as the anterior lens surface. The dynamics of the measurement are driven by the biomechanical properties of the human eye as well as its intraocular pressure. Thus, analysis of the M-scan may provide useful information about the biomechanical behavior of the anterior segment during the applanation caused by the air puff. An initial set of controlled clinical experiments is presented to demonstrate the performance of the instrument and its potential applicability to furthering the understanding of eye biomechanics and intraocular pressure measurements. Limitations and possibilities of the new apparatus are discussed.
Abstract:
In recent years, the development of Unmanned Aerial Vehicles (UAVs) has become a significant growing segment of the global aviation industry. These vehicles are developed with the intention of operating in regions where the presence of onboard human pilots is either too risky or unnecessary. Their popularity with both the military and civilian sectors has seen the use of UAVs in a diverse range of applications, from reconnaissance and surveillance tasks for the military to civilian uses such as aid relief and monitoring tasks. Efficient energy utilisation on a UAV is essential to its functioning, often to achieve the operational goals of range, endurance and other specific mission requirements. Due to the limitations of the space available and the mass budget on the UAV, it is often a delicate balance between the onboard energy available (i.e. fuel) and achieving the operational goals. This thesis presents an investigation of methods for increasing the energy efficiency of UAVs. One method is via the development of a Mission Waypoint Optimisation (MWO) procedure for a small fixed-wing UAV, focusing on improving the onboard fuel economy. MWO deals with a pre-specified set of waypoints by modifying the given waypoints within certain limits to achieve its optimisation objectives of minimising/maximising specific parameters. A simulation model of a UAV was developed in the MATLAB Simulink environment, utilising the AeroSim Blockset and the in-built Aerosonde UAV block and its parameters. This simulation model was separately integrated with a multi-objective Evolutionary Algorithm (MOEA) optimiser and a Sequential Quadratic Programming (SQP) solver to perform single-objective and multi-objective optimisation of a set of real-world waypoints in order to minimise the onboard fuel consumption. The results of both procedures show potential for reducing fuel consumption on a UAV in a flight mission. Additionally, a parallel Hybrid-Electric Propulsion System (HEPS) incorporating an Ideal Operating Line (IOL) control strategy was developed for a small fixed-wing UAV. An IOL analysis of an Aerosonde engine was performed, and the most efficient points of operation for this engine (i.e. those providing the greatest torque output for the least fuel consumption) were determined. Simulation models of the components in a HEPS were designed and constructed in the MATLAB Simulink environment. It was demonstrated through simulation that a UAV with the current HEPS configuration was capable of achieving a fuel saving of 6.5% compared to the internal combustion engine (ICE) only configuration. These components form the basis for the development of a complete simulation model of a Hybrid-Electric UAV (HEUAV).
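As a toy illustration of the SQP side of the waypoint optimisation (the real study couples the solver to the Simulink/AeroSim flight model, not to a closed-form cost), one might pose the problem as bounded minimisation of a hypothetical fuel surrogate over the waypoint coordinates:

# Illustrative waypoint optimisation with an SQP solver; the fuel-cost model is invented.

import numpy as np
from scipy.optimize import minimize

waypoints = np.array([[0.0, 0.0], [5.0, 2.0], [10.0, -1.0], [15.0, 0.0]])  # nominal (x, y) in km
x0 = waypoints.ravel()

def fuel_cost(x):
    wp = x.reshape(-1, 2)
    legs = np.diff(wp, axis=0)
    dist = np.linalg.norm(legs, axis=1)                  # leg lengths
    headings = np.arctan2(legs[:, 1], legs[:, 0])
    turn_penalty = np.sum(np.abs(np.diff(headings)))     # sharp turns burn more fuel
    return dist.sum() + 0.5 * turn_penalty               # toy surrogate for fuel burn

# Each waypoint coordinate may move at most 1 km from its nominal value.
bounds = [(v - 1.0, v + 1.0) for v in x0]

res = minimize(fuel_cost, x0, method="SLSQP", bounds=bounds)
print(res.x.reshape(-1, 2))
print(res.fun)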
Abstract:
Two decades after its inception, Latent Semantic Analysis (LSA) has become part and parcel of every modern introduction to Information Retrieval. For any tool that matures so quickly, it is important to check its lore and limitations, or else stagnation will set in. We focus here on the three main aspects of LSA that are well accepted, the gist of which can be summarized as follows: (1) that LSA recovers latent semantic factors underlying the document space, (2) that this can be accomplished through lossy compression of the document space by eliminating lexical noise, and (3) that the latter is best achieved by Singular Value Decomposition. For each aspect we performed experiments analogous to those reported in the LSA literature and compared the evidence brought to bear in each case. On the negative side, we show that the above claims about LSA are much more limited than commonly believed. Even a simple example may show that LSA does not recover the optimal semantic factors as intended in the pedagogical example used in many LSA publications. Additionally, and remarkably deviating from LSA lore, LSA does not scale up well: the larger the document space, the more unlikely it is that LSA recovers an optimal set of semantic factors. On the positive side, we describe new algorithms to replace LSA (and more recent alternatives such as pLSA, LDA, and kernel methods) by trading its l2 space for an l1 space, thereby guaranteeing an optimal set of semantic factors. These algorithms seem to salvage the spirit of LSA as we think it was initially conceived.
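For context, the textbook LSA construction the abstract examines is a rank-k truncated SVD of the term-document matrix; the tiny matrix below is invented purely to show the mechanics.

# Textbook LSA sketch: rank-k truncated SVD of a term-document matrix.

import numpy as np

# rows = terms, columns = documents (e.g. raw term counts); values are made up
A = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 2, 0, 1],
              [0, 0, 1, 2]], dtype=float)

k = 2                                        # number of latent "semantic factors"
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # lossy, l2-optimal rank-k reconstruction

doc_vectors = np.diag(s[:k]) @ Vt[:k, :]     # documents in the k-dimensional LSA space
print(np.round(A_k, 2))
print(np.round(doc_vectors, 2))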
Abstract:
In recent years considerable attention has been paid to the numerical solution of stochastic ordinary differential equations (SODEs), as SODEs are often more appropriate than their deterministic counterparts in many modelling situations. However, unlike the deterministic case, numerical methods for SODEs are considerably less sophisticated due to the difficulty in representing the (possibly large number of) random variable approximations to the stochastic integrals. Although Burrage and Burrage [High strong order explicit Runge-Kutta methods for stochastic ordinary differential equations, Applied Numerical Mathematics 22 (1996) 81-101] were able to construct strong local order 1.5 stochastic Runge-Kutta methods for certain cases, it is known that all extant stochastic Runge-Kutta methods suffer an order reduction down to strong order 0.5 if there is non-commutativity between the functions associated with the multiple Wiener processes. This order reduction down to that of the Euler-Maruyama method imposes severe difficulties in obtaining meaningful solutions in a reasonable time frame, and this paper attempts to circumvent these difficulties with some new techniques. An additional difficulty in solving SODEs arises even in the linear case, since it is not possible to write the solution analytically in terms of matrix exponentials unless there is a commutativity property between the functions associated with the multiple Wiener processes. Thus, in the present paper, the work of Magnus [On the exponential solution of differential equations for a linear operator, Communications on Pure and Applied Mathematics 7 (1954) 649-673] (applied to deterministic non-commutative linear problems) will first be applied to non-commutative linear SODEs, and methods of strong order 1.5 for arbitrary, linear, non-commutative SODE systems will be constructed, hence giving an accurate approximation to the general linear problem. Secondly, for general nonlinear non-commutative systems with an arbitrary number (d) of Wiener processes, it is shown that strong local order 1 Runge-Kutta methods with d + 1 stages can be constructed by evaluating a set of Lie brackets as well as the standard function evaluations. A method is then constructed which can be efficiently implemented in a parallel environment for this arbitrary number of Wiener processes. Finally, some numerical results are presented which illustrate the efficacy of these approaches. (C) 1999 Elsevier Science B.V. All rights reserved.
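For orientation, the Euler-Maruyama scheme referred to above (the strong order 0.5 baseline) for dX = f(X) dt + sum_{j=1..d} g_j(X) dW_j can be sketched as follows; the drift and diffusion functions chosen here are arbitrary illustrative examples, not those studied in the paper.

# Euler-Maruyama for an SODE with d independent Wiener processes.

import numpy as np

def euler_maruyama(f, g_list, x0, T, N, rng):
    dt = T / N
    x = np.array(x0, dtype=float)
    for _ in range(N):
        dW = rng.normal(0.0, np.sqrt(dt), size=len(g_list))  # one increment per Wiener process
        x = x + f(x) * dt + sum(gj(x) * dWj for gj, dWj in zip(g_list, dW))
    return x

f = lambda x: -x                                               # simple linear drift
g_list = [lambda x: 0.5 * x, lambda x: np.array([0.1, 0.0])]   # d = 2, non-commuting diffusions
print(euler_maruyama(f, g_list, x0=[1.0, 1.0], T=1.0, N=1000, rng=np.random.default_rng(0)))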
Abstract:
The management and improvement of business processes are a core topic of the information systems discipline. The persistent demand in corporations within all industry sectors for increased operational efficiency and innovation, an emerging set of established and evaluated methods, tools, and techniques, as well as the quickly growing body of academic and professional knowledge, are indicative of the standing that Business Process Management (BPM) has nowadays. During the last decades, intensive research has been conducted with respect to the design, implementation, execution, and monitoring of business processes. Comparatively little attention, however, has been paid to questions related to organizational issues such as the adoption, usage, implications, and overall success of BPM approaches, technologies, and initiatives. This research gap motivated us to edit a corresponding special focus issue for the journal BISE/WIRTSCHAFTSINFORMATIK. We are happy to be able to present a selection of three research papers and a state-of-the-art paper in the scientific section of the issue at hand. As these papers differ in the topics they investigate, the research methods they apply, and the theoretical foundations they build on, the diversity within the BPM field becomes evident. The academic papers are complemented by an interview with Phil Gilbert, IBM’s Vice President for Business Process and Decision Management, who reflects on the relationship between business processes and the data flowing through them, the need to establish a process context for decision making, and the calibration of BPM efforts toward executives who see processes as a means to an end, rather than a first-order concept in its own right.
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled by average Internet users. The management of secure passwords, for example, creates an extra overhead which is often neglected due to usability reasons. Furthermore, password-based approaches are applicable only for initial logins and do not protect against unlocked workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics based on free text is used continuously for verifying the identity of a user in real time. We improve existing keystroke dynamics-based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into consideration in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set which we collected from users while they were interacting with their mailboxes during their daily activities.
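Purely for flavour, a generic digraph-latency comparison of the kind used in free-text keystroke dynamics might look like the sketch below; this is not the new distance measure proposed in the paper.

# Generic digraph-latency distance for free-text keystroke samples (illustrative only).

from collections import defaultdict

def digraph_latencies(events):
    """events: list of (key, press_time_ms) tuples in typing order."""
    lat = defaultdict(list)
    for (k1, t1), (k2, t2) in zip(events, events[1:]):
        lat[(k1, k2)].append(t2 - t1)
    return {dg: sum(v) / len(v) for dg, v in lat.items()}

def distance(profile, sample):
    """Mean absolute latency difference over digraphs present in both samples."""
    shared = profile.keys() & sample.keys()
    if not shared:
        return float("inf")
    return sum(abs(profile[d] - sample[d]) for d in shared) / len(shared)

enrolled = digraph_latencies([("t", 0), ("h", 110), ("e", 190), ("t", 400), ("h", 505)])
probe    = digraph_latencies([("t", 0), ("h", 120), ("e", 210)])
print(distance(enrolled, probe))   # small value -> plausibly the same user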
Abstract:
Shared services have gained significance as an organizational arrangement, in particular for support functions, to reduce costs, increase quality and create new capabilities. The Information Systems (IS) function is amenable to sharing arrangements and information systems can enable sharing in other functional areas. However, despite being a promising area for IS research, literature on shared services in the IS discipline is scarce and scattered. There is still little consensus on what shared services is. Moreover, a thorough understanding of why shared services are adopted, who are involved, and how things are shared is lacking. In this article, we set out to progress IS research on shared services by establishing a common ground for future research and proposing a research agenda to shape the field based on an analysis of the IS literature. We present a holistic and inclusive definition, discuss the primacy of economic-strategic objectives so far, and introduce conceptual frameworks for stakeholders and the notion of sharing. We also provide an overview of the theories and research methods applied. We propose a research agenda that addresses fundamental issues related to objectives, stakeholders, and the notion of sharing to lay the foundation for taking IS research on shared services forward.