924 results for Java card
Abstract:
The draft of the first stage of the national curriculum has now been published. Its final form, to be presented in December 2010, should be the centrepiece of Labor's Educational Revolution. All the other aspects – personal computers, new school buildings, rebates for uniforms and even the MySchool report card – are marginal to the prescription of what is to be taught and learnt in schools. The seven authors in this journal's Point and Counterpoint (Curriculum Perspectives, 30(1) 2010, pp.53-74) raise a number of both large and small issues in education as a whole, and in science education more particularly. Two of them (Groves and McGarry) make brief reference to earlier attempts to achieve a national curriculum in Australia. Those writing from New Zealand and the USA will be unaware of just how ambitious this project is for Australia: a bold and overdue educational adventure, or a foolish political decision destined to fail, as happened in the late 1970s and the 1990s.
Abstract:
New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The “error budget” is set by the performance requirements of each application. The “expenditure” of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a “bottom up” component testing approach combined with “top down” system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
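The "error budget" bookkeeping described in this abstract can be sketched as a simple worst-case sum of per-component errors against the application's requirement. The component figures below (grandmaster, per-switch, slave errors) are hypothetical placeholders, not measurements from the paper:

```java
// Sketch of error-budget accounting for a PTP synchronization chain.
// All nanosecond figures are illustrative assumptions, not measured values.
public class PtpErrorBudget {
    // Worst-case accumulation: grandmaster error, plus the Correction
    // Field Error contributed by each transparent clock (switch) in the
    // path, plus the slave clock's own error.
    public static long spentNanoseconds(long grandmasterNs, long perSwitchNs,
                                        int switchCount, long slaveNs) {
        return grandmasterNs + (long) switchCount * perSwitchNs + slaveNs;
    }

    public static void main(String[] args) {
        long budgetNs = 1000; // 1 microsecond sampling-accuracy requirement
        long spent = spentNanoseconds(200, 50, 3, 300); // hypothetical figures
        System.out.println("Spent " + spent + " ns of " + budgetNs + " ns budget");
        System.out.println("Within budget: " + (spent <= budgetNs));
    }
}
```

A system designer can rerun the sum for each candidate topology to see which component "expenditures" leave headroom under the budget.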
Abstract:
This book is a reader for primary school students, stage 27-30 (fluent), incorporating mathematics themes. There is a fictional narrative, entitled "A Day at the Show", that describes the activities of Tess and Alex when visiting their local show. A non-fiction exposition, entitled "Supporting Australian Shows", explains more about Australian shows and why we should support them. Accompanying the book is a "building comprehension card" to assist teachers in their classroom use of the reader.
Abstract:
This book is a reader for primary school students, stage 24-26 (fluent), incorporating mathematics themes. There is a fictional narrative, entitled "A Flying Visit", that describes Tess' and Alex's encounter with the Flying Doctor. A non-fiction recount, entitled "Fundraising for the Flying Doctors", describes the activities of a class group in raising money for the Flying Doctors. Accompanying the book is a "building comprehension card" to assist teachers in their classroom use of the reader.
Abstract:
This paper describes in detail our Security-Critical Program Analyser (SCPA). SCPA is used to assess the security of a given program based on its design or source code with regard to data flow-based metrics. Furthermore, it allows software developers to generate a UML-like class diagram of their program and annotate its confidential classes, methods and attributes. SCPA is also capable of producing Java source code for the generated design of a given program. This source code can then be compiled and the resulting Java bytecode program can be used by the tool to assess the program's overall security based on our security metrics.
Abstract:
Refactoring is a common approach to producing better quality software. Its impact on many software quality properties, including reusability, maintainability and performance, has been studied and measured extensively. However, its impact on the information security of programs has received relatively little attention. In this work, we assess the impact of a number of the most common code-level refactoring rules on data security, using security metrics that are capable of measuring security from the viewpoint of potential information flow. The metrics are calculated for a given Java program using a static analysis tool we have developed to automatically analyse compiled Java bytecode. We ran our Java code analyser on various programs which were refactored according to each rule. New values of the metrics for the refactored programs then confirmed that the code changes had a measurable effect on information security.
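As a toy illustration of a ratio-style information-flow metric (the paper's actual metrics are computed from compiled Java bytecode and are more involved), consider the fraction of a class's confidential attributes that remain accessible from outside it, before and after a hypothetical encapsulating refactoring. The attribute names and the metric itself are illustrative assumptions:

```java
// Toy ratio-style security metric: the proportion of classified
// (confidential) attributes that are not declared private, i.e. are
// potentially exposed to information flow. Illustrative only; the
// published metrics are calculated from bytecode by a static analyser.
import java.util.List;

public class SecurityMetricSketch {
    static class Attribute {
        final String name;
        final boolean classified; // carries confidential data
        final boolean isPrivate;  // declared private in the class
        Attribute(String name, boolean classified, boolean isPrivate) {
            this.name = name;
            this.classified = classified;
            this.isPrivate = isPrivate;
        }
    }

    // 0.0 = no classified attribute exposed, 1.0 = all of them exposed.
    static double classifiedExposure(List<Attribute> attrs) {
        int classified = 0, exposed = 0;
        for (Attribute a : attrs) {
            if (a.classified) {
                classified++;
                if (!a.isPrivate) exposed++;
            }
        }
        return classified == 0 ? 0.0 : (double) exposed / classified;
    }

    public static void main(String[] args) {
        List<Attribute> before = List.of(
            new Attribute("pin", true, false),    // classified and public
            new Attribute("label", false, false));
        List<Attribute> after = List.of(
            new Attribute("pin", true, true),     // refactored to private
            new Attribute("label", false, false));
        System.out.println(classifiedExposure(before)); // 1.0
        System.out.println(classifiedExposure(after));  // 0.0
    }
}
```

Comparing the metric before and after a refactoring rule is applied mirrors the paper's approach of confirming that code changes have a measurable effect on the security value.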
Abstract:
A range of risk management initiatives have been introduced in organisations in an attempt to reduce occupational road incidents. However, a discrepancy exists between the initiatives that are frequently implemented in organisations and the initiatives that have demonstrated scientific merit in improving occupational road safety. Given that employees' beliefs may facilitate or act as a barrier to implementing initiatives, it is important to understand whether initiatives with scientific merit are perceived to be effective by employees. To explore employee perceptions of occupational road safety initiatives, a questionnaire was administered to 679 employees sourced from four Australian organisations. Participants ranged in age from 18 to 65 years (M = 42, SD = 11). Participants rated 35 initiatives on how effective they thought each would be in improving road safety in their organisation. The initiatives perceived by employees to be most effective in managing occupational road risks were: making vehicle safety features (e.g. passenger airbags) standard; practical driver skills training; and investigation of serious vehicle incidents. The initiatives perceived to be least effective were: signing a promise card as a commitment to drive safely; advertising the organisation's phone number on vehicles for complaints and compliments; and consideration of driving competency in the staff selection process. Employee perceptions were analysed at a factor level and at an initiative level. The mean scores for the three extracted factors revealed that employees believed occupational road risks could best be managed by the employer implementing engineering and human resource methods to enhance road safety. Initiatives relating to employer management of identified risk factors were perceived to be more effective than feedback or motivational methods that required employees to accept responsibility for their driving safety.
Practitioners can use the findings from this study to make informed decisions about how they select, manage and market occupational safety initiatives.
Abstract:
Performance of urban transit systems may be quantified and assessed using transit capacity and productive capacity in planning, design and operational management activities. Bunker (4) defines important productive performance measures of an individual transit service and transit line, which are extended in this paper to quantify the efficiency and operating fashion of transit services and lines. A comparison of a hypothetical bus line's operation during a morning peak hour and a daytime hour demonstrates the usefulness to the operator of productiveness efficiency, passenger transmission efficiency, passenger churn and average proportion of line length traveled in understanding a service's and line's productive performance, operating characteristics, and quality of service. Productiveness efficiency can flag potential pass-up activity under high load conditions, as well as ineffective resource deployment. Proportion of line length traveled can directly measure operating fashion. These measures can be used to compare lines/routes and, within a given line, various operating scenarios and time horizons, to target improvements. The next research stage is to investigate within-line variation using smart card passenger data and field observation of pass-ups. The insights will be used to further develop practical guidance for operators.
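One of the measures named above, average proportion of line length traveled, lends itself to a minimal sketch: the mean, over passenger trips, of trip distance divided by line length. The trip distances and line length below are hypothetical, and Bunker's published definitions should be consulted for the exact formulas:

```java
// Illustrative calculation of "average proportion of line length traveled"
// for a set of passenger trips on one line. All distances are hypothetical;
// this is a sketch of the idea, not the paper's published formulas.
public class LineMeasuresSketch {
    static double avgProportionLineLength(double[] tripKm, double lineKm) {
        double sum = 0;
        for (double d : tripKm) sum += d / lineKm; // per-trip proportion
        return sum / tripKm.length;                // mean over all trips
    }

    public static void main(String[] args) {
        double lineKm = 20.0;                        // hypothetical line length
        double[] peakTrips = {5, 10, 20, 15};        // hypothetical distances
        double[] daytimeTrips = {2, 4, 5, 3};
        System.out.println(avgProportionLineLength(peakTrips, lineKm));    // 0.625
        System.out.println(avgProportionLineLength(daytimeTrips, lineKm)); // 0.175
    }
}
```

A higher value (as in the peak-hour case) indicates passengers riding most of the line, which bears on operating fashion as described above; smart card data would supply the per-trip distances in practice.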
Abstract:
In May 2011, the Centre for Crime and Justice Studies published Lessons for the Coalition: an end of term report on New Labour and Criminal Justice (Silvestri, 2011). In that collection I described Labour's performance on environmental issues as ‘too little too late’. The UK experienced a period of Blair/Brown environmental governance that demonstrated ‘symbolic success but real failure’. Amongst New Labour's environmental achievements were the establishment of the Climate Change Act 2008, the creation of the Department of Energy and Climate Change and the establishment of numerous green quangos to oversee and implement a range of environmental policies. However, these steps forward were seemingly threatened in the early days of a Cameron-led coalition, where austerity measures, trade and the abolition of green quangos were on the cards. In sum, I concluded ‘future UK government report cards on the environment do not look good’ (Walters, 2011). After two and a half years of a Conservative/Liberal Democrat coalition, and much rhetoric about it being ‘the greenest government ever’, the interim report card for the Cameron government on environmental matters makes grim reading indeed. The demise of green quangos, record carbon emissions, stultified renewable energy policies, environmental criminality and victimisation all but ignored, and billions of pounds lost to environmental corporate fraudsters are just some of the headlines of Tory-inspired governance with much environmental rhetoric and no environmental results.
Abstract:
miRDeep and its variants are widely used to quantify known and novel microRNAs (miRNAs) from small RNA sequencing (RNAseq) data. This article describes miRDeep*, our integrated miRNA identification tool, which is modeled on miRDeep but improves the precision of detecting novel miRNAs by introducing new strategies to identify precursor miRNAs. miRDeep* has a user-friendly graphical interface and accepts raw data in FastQ and Sequence Alignment Map (SAM) or the binary equivalent (BAM) format. Known and novel miRNA expression levels, as measured by the number of reads, are displayed in an interface that shows each RNAseq read relative to the pre-miRNA hairpin. The secondary pre-miRNA structure and read locations for each predicted miRNA are shown and kept in a separate figure file. Moreover, the target genes of known and novel miRNAs are predicted using the TargetScan algorithm, and the targets are ranked according to the confidence score. miRDeep* is an integrated standalone application in which sequence alignment, pre-miRNA secondary structure calculation and graphical display are coded purely in Java. This application can be executed on a normal personal computer with 1.5 GB of memory. Further, we show that miRDeep* outperformed existing miRNA prediction tools on our LNCaP and other small RNAseq datasets. miRDeep* is freely available online at http://www.australianprostatecentre.org/research/software/mirdeep-star
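The read-count expression measure mentioned above can be sketched as a minimal tally over simplified SAM-like records (read name and reference hairpin name only). The record layout here is a deliberate simplification; miRDeep* itself parses full SAM/BAM input:

```java
// Sketch of the read-count expression measure: tally aligned reads per
// pre-miRNA hairpin. Records carry only "readName TAB hairpinName";
// real SAM records have many more fields.
import java.util.LinkedHashMap;
import java.util.Map;

public class ReadCountSketch {
    static Map<String, Integer> countReads(String[] samLines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : samLines) {
            String hairpin = line.split("\t")[1]; // reference (hairpin) name
            counts.merge(hairpin, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] records = {
            "read1\thsa-mir-21", "read2\thsa-mir-21", "read3\thsa-mir-141"
        };
        System.out.println(countReads(records)); // {hsa-mir-21=2, hsa-mir-141=1}
    }
}
```

The per-hairpin totals correspond to the expression levels that the tool's interface displays alongside each read's position on the hairpin.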
Abstract:
J.W. Lindt’s Colonial man and Aborigine image from the GRAFTON ALBUM: “On chemistry and optics all does not depend, art must with these in triple union blend” (text from J.W. Lindt’s photographic backing card). In this paper, I follow an argument that Lindt held a position in his particular colonial environment in which he was simultaneously both an insider and an outsider, and that such a position may be considered a prerequisite for stimulating exchange. A study of the transition of J.W. Lindt in Grafton, N.S.W. in the 1860s from a traveller to a migrant and subsequently to a professional photographer, as well as Lindt’s photographic career, which evolved through strategic action and technical approaches to photography, bears witness to his cultural relativity. One untitled photograph from this period of work constructs a unique commentary on Australian colonial life that illustrates a non-hegemonic position, particularly as it was included in one of the first albums of photographs of Aborigines that Lindt gifted to an illustrious person (in this case the Mayor of Grafton). As in his other studio constructions, props and backdrops were arranged and sitters were positioned with care, but this photograph is the only one in the album that includes a non-Aborigine in a relationship to an Aborigine. An analysis of the props, technical details of the album and the image suggests a reconciliatory aspect that thwarts the predominant attitudes towards Aborigines in the area at that time.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms.
Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
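The step of combining per-beam Monte Carlo results according to the plan's monitor units can be sketched as a weighted sum over dose grids. The beams, monitor-unit values and flattened 1-D grids below are illustrative only; MCDTK operates on full 3-D DICOM dose data:

```java
// Sketch of combining per-beam Monte Carlo dose grids weighted by the
// monitor units (MU) from the exported plan. Grids are flattened 1-D
// voxel arrays here purely for illustration.
public class DoseCombineSketch {
    // beamDosePerMu[b][v]: dose per monitor unit from beam b at voxel v
    static double[] combine(double[][] beamDosePerMu, double[] monitorUnits) {
        double[] total = new double[beamDosePerMu[0].length];
        for (int b = 0; b < beamDosePerMu.length; b++)
            for (int v = 0; v < total.length; v++)
                total[v] += beamDosePerMu[b][v] * monitorUnits[b];
        return total;
    }

    public static void main(String[] args) {
        // Two hypothetical beams over three voxels (dose per MU)
        double[][] beams = {{0.01, 0.02, 0.01}, {0.02, 0.01, 0.02}};
        double[] mu = {100, 50}; // hypothetical monitor units per beam
        double[] dose = combine(beams, mu);
        System.out.printf("%.1f %.1f %.1f%n", dose[0], dose[1], dose[2]);
    }
}
```

The resulting grid is the kind of combined 3D dose distribution that would then be compared against the treatment planning system's calculation.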
Abstract:
Our daily lives are becoming more and more dependent on smartphones due to their increased capabilities. Smartphones are used in various ways, from payment systems to assisting the lives of elderly or disabled people. Security threats to these devices are increasingly dangerous, since there is still a lack of proper security tools for protection. Android has emerged as an open smartphone platform that allows modification even at the operating system level, so third-party developers have the opportunity to develop kernel-based low-level security tools, which is unusual for smartphone platforms. Android quickly gained popularity among smartphone developers and beyond, since it is based on Java on top of an "open" Linux, in contrast to earlier proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, for example, which held the greatest market share among all smartphone OSs, closed critical APIs to common developers and introduced application certification, because it was the main target of smartphone malware in the past. In fact, more than 290 malware samples designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications that endanger critical smartphone applications and owners' privacy. In this work, we present our current results in analyzing the security of Android smartphones, with a focus on the Linux side. Our results are not limited to Android; they are also applicable to Linux-based smartphones such as the OpenMoko Neo FreeRunner. Our contribution in this work is three-fold. First, we analyze the Android framework and the Linux kernel to check their security functionality. We survey well-accepted security mechanisms and tools that can increase device security.
We describe how to adopt these security tools on the Android kernel and analyze their overhead in terms of resource usage. As open smartphones are released and may gain market share as Symbian did, they may attract the attention of malware writers. Therefore, our second contribution focuses on malware detection techniques at the kernel level. We test the applicability of existing signature-based and intrusion detection methods in the Android environment, focusing on monitoring events in the kernel; that is, identifying critical kernel, log file, file system and network activity events, and devising efficient mechanisms to monitor them in a resource-limited environment. Our third contribution presents initial results of our malware detection mechanism based on static function call analysis. We identified approximately 105 Executable and Linking Format (ELF) executables installed on the Linux side of Android and performed a statistical analysis of the function calls they use. The results of this analysis can be compared against newly installed applications to detect significant differences. Additionally, certain function calls indicate malicious activity, so we present a simple decision tree for deciding the suspiciousness of the corresponding application. Our results represent a first step towards detecting malicious applications on Android-based devices.
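The decision-tree idea described here can be sketched as follows: classify an executable from its static function-call profile, treating certain calls as indicators of malicious activity. The call names, rules and labels are illustrative assumptions, not the tree from the paper:

```java
// Sketch of a simple decision tree over a binary's statically extracted
// function calls. The chosen calls and thresholds are illustrative only.
import java.util.Set;

public class SuspicionTreeSketch {
    static String classify(Set<String> calls) {
        // A call treated here as a strong indicator on its own
        if (calls.contains("ptrace")) return "suspicious";
        // Network access combined with process execution: weaker indicator
        if (calls.contains("connect") && calls.contains("execve"))
            return "needs review";
        return "benign";
    }

    public static void main(String[] args) {
        System.out.println(classify(Set.of("open", "read")));      // benign
        System.out.println(classify(Set.of("connect", "execve"))); // needs review
        System.out.println(classify(Set.of("ptrace", "write")));   // suspicious
    }
}
```

In the statistical-analysis setting described above, a newly installed application's call profile would first be compared against the baseline of preinstalled ELF executables, with a tree like this as a final suspiciousness check.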
Abstract:
Smartphones are becoming increasingly popular as more and more smartphone platforms emerge. Special attention has been gained by the open source platform Android, presented by the Open Handset Alliance (OHA), whose members include Google, Motorola, and HTC. Android uses a Linux kernel and a stripped-down userland with a custom Java VM set on top. The resulting system joins the advantages of both environments, although at the moment third parties are intended to develop only Java applications. In this work, we present the benefits of using native applications in Android. Android includes a fully functional Linux, and using it for heavy computational tasks when developing applications can bring a substantial performance increase. We present how to develop native applications and software components, as well as how to let Linux applications and components communicate with Java programs. Additionally, we present performance measurements of native and Java applications executing identical tasks. The results show that native C applications can be up to 30 times as fast as an identical algorithm running in the Dalvik VM. Java applications can achieve a speed-up of up to 10 times by utilizing JNI.