400 results for Science and Technology System


Relevance:

100.00%

Publisher:

Abstract:

Policy makers increasingly recognise that an educated workforce with a high proportion of Science, Technology, Engineering and Mathematics (STEM) graduates is a prerequisite for a knowledge-based, innovative economy. Over the past ten years, the proportion of first university degrees awarded in Australia in STEM fields has been below the global average, decreasing from 22.2% in 2002 to 18.8% in 2010 [1]. These trends are mirrored by declines of between 20% and 30% in the proportions of high school students enrolled in science or maths. The trends are not unique to Australia, but their impact is of concern throughout the policy-making community. To redress these demographic trends, QUT embarked upon a long-term investment strategy to integrate education and research into the physical and virtual infrastructure of the campus, recognising that students' expectations change as rapidly as technology and learning practices do. To implement this strategy, refurbishment and re-building of physical infrastructure is accompanied by upgraded technologies for both learning and research. QUT's vision for its city-based campuses is to create vibrant and attractive places to learn and research, linked strongly to the wider surrounding community. Over a five-year period, physical infrastructure at the Gardens Point campus was substantially reconfigured in two key stages: (a) a >$50m refurbishment of heritage-listed buildings to encompass public, retail and social spaces, learning and teaching "test beds" and research laboratories; and (b) demolition of five buildings, replaced by a $230m, >40,000 m2 Science and Engineering Centre designed to accommodate retail, recreation, services, education and research in an integrated, coordinated precinct.
This landmark project is characterised by (i) self-evident, collaborative spaces for learning, research and social engagement; (ii) sustainable building practices and sustainable ongoing operation; and (iii) dynamic and mobile re-configuration of spaces or staffing to meet demand. Innovative spaces allow for transformative, cohort-driven learning and the collaborative use of space to pursue joint class projects. Research laboratories are aggregated, centralised and "on display" to the public, students and staff. A major visualisation space – the largest multi-touch, multi-user facility constructed to date – is a centrepiece feature that focuses on demonstrating scientific and engineering principles or science-oriented scenes at large scale (e.g. the Great Barrier Reef). Content on this visualisation facility is integrated with the regional school curricula and supports an in-house schools program for student and teacher engagement. Researchers are accommodated in combined open-plan and office floor-space (80% open plan) to encourage interdisciplinary engagement and cross-fertilisation of skills, ideas and projects. This combination of spaces re-invigorates the on-campus experience, extends educational engagement across all ages and rapidly enhances research collaboration.

Relevance:

100.00%

Publisher:

Abstract:

Stream ciphers are common cryptographic algorithms used to protect the confidentiality of frame-based communications like mobile phone conversations and Internet traffic. Stream ciphers are ideal for encrypting these types of traffic as they have the potential to encrypt them quickly and securely, and have low error propagation. The main objective of this thesis is to determine whether structural features of keystream generators affect the security provided by stream ciphers. These structural features pertain to the state-update and output functions used in keystream generators. Using linear sequences as keystream to encrypt messages is known to be insecure, so modern keystream generators use nonlinear sequences as keystream. The nonlinearity can be introduced through a keystream generator's state-update function, output function, or both. The first contribution of this thesis relates to nonlinear sequences produced by the well-known Trivium stream cipher. Trivium is one of the stream ciphers selected in the final portfolio of eSTREAM, a multi-year European project run by the ECRYPT network. Trivium's structural simplicity makes it a popular cipher to cryptanalyse, but to date there are no attacks in the public literature which are faster than exhaustive keysearch. Algebraic analyses are performed on the Trivium stream cipher, which uses a nonlinear state-update function and a linear output function to produce keystream. Two algebraic investigations are performed: an examination of the sliding property in the initialisation process, and algebraic analyses of Trivium-like stream ciphers using a combination of the algebraic techniques previously applied separately by Berbain et al. and Raddum. For certain iterations of Trivium's state-update function, we examine the sets of slid pairs, looking particularly to form chains of slid pairs. No chains exist for a small number of iterations. This has implications for the period of keystreams produced by Trivium.
Secondly, using our combination of the methods of Berbain et al. and Raddum, we analysed Trivium-like ciphers and improved on previous analysis with regard to forming systems of equations for these ciphers. Using these new systems of equations, we were able to successfully recover the initial state of Bivium-A. The attack complexities for Bivium-B and Trivium were, however, worse than exhaustive keysearch. We also show that the selection of stages used as input to the output function and the size of the registers used in the construction of the system of equations affect the success of the attack. The second contribution of this thesis is the examination of state convergence. State convergence is an undesirable characteristic in keystream generators for stream ciphers, as it implies that the effective session key size of the stream cipher is smaller than the designers intended. We identify methods which can be used to detect state convergence. As a case study, the Mixer stream cipher, which uses nonlinear state-update and output functions to produce keystream, is analysed. Mixer is found to suffer from state convergence as the state-update function used in its initialisation process is not one-to-one. A discussion of several other stream ciphers which are known to suffer from state convergence is given. From our analysis of these stream ciphers, three mechanisms which can cause state convergence are identified. The effect state convergence can have on stream cipher cryptanalysis is examined. We show that state convergence can have a positive effect if the goal of the attacker is to recover the initial state of the keystream generator. The third contribution of this thesis is the examination of the distributions of bit patterns in the sequences produced by nonlinear filter generators (NLFGs) and linearly filtered nonlinear feedback shift registers.
We show that the selection of stages used as input to a keystream generator's output function can affect the distribution of bit patterns in the sequences produced by these keystream generators, and that the effect differs for nonlinear filter generators and linearly filtered nonlinear feedback shift registers. In the case of NLFGs, the keystream sequences produced when the output functions take inputs from consecutive register stages are less uniform than sequences produced by NLFGs whose output functions take inputs from unevenly spaced register stages. The opposite is true for keystream sequences produced by linearly filtered nonlinear feedback shift registers.
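For readers unfamiliar with the cipher, the structure analysed above (a nonlinear state-update function feeding a linear output function) can be sketched as follows. This is an illustrative bit-level implementation following the public eSTREAM Trivium specification, not the analysis code from the thesis.

```python
# Illustrative bit-level sketch of the Trivium keystream generator
# (eSTREAM specification). The 288-bit state is split across three
# shift registers of lengths 93, 84 and 111.

def trivium(key, iv, nbits):
    """key, iv: lists of 80 bits each; returns nbits of keystream."""
    # Load key and IV; the last three state bits are set to 1.
    s = key + [0] * 13 + iv + [0] * 4 + [0] * 108 + [1, 1, 1]
    assert len(s) == 288

    def step():
        # Linear taps: the output function is a XOR of six state bits.
        t1 = s[65] ^ s[92]
        t2 = s[161] ^ s[176]
        t3 = s[242] ^ s[287]
        z = t1 ^ t2 ^ t3
        # Nonlinearity enters via the AND terms in the state update.
        t1 ^= (s[90] & s[91]) ^ s[170]
        t2 ^= (s[174] & s[175]) ^ s[263]
        t3 ^= (s[285] & s[286]) ^ s[68]
        # Shift the three registers, inserting the feedback bits.
        s[0:93] = [t3] + s[0:92]
        s[93:177] = [t1] + s[93:176]
        s[177:288] = [t2] + s[177:287]
        return z

    for _ in range(4 * 288):  # initialisation: 1152 rounds, output discarded
        step()
    return [step() for _ in range(nbits)]
```

Note how each keystream bit z depends linearly on the state, while the feedback bits t1, t2, t3 contain AND terms: exactly the "nonlinear state-update, linear output" combination the thesis examines.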

Relevance:

100.00%

Publisher:

Abstract:

This study was undertaken to examine the influence that a set of Professional Development (PD) initiatives had on faculty use of Moodle, a well-known Course Management System. The context of the study was a private language university just outside Tokyo, Japan. Specifically, the study aimed to identify the way in which the PD initiatives adhered to professional development best-practice criteria; how faculty members perceived the PD initiatives; what impact the PD initiatives had on faculty use of Moodle; and what other variables may have influenced faculty in their use of Moodle. The study utilised a mixed-methods approach. Participants were 42 teachers who worked at the university in the 2008/9 academic year. Data were collected through an online survey, semi-structured face-to-face interviews, post-workshop surveys, and a collection of textual artefacts. The online survey consisted of 115 items, factored into 10 constructs. The quantitative data were analysed in SPSS using descriptive statistics, Spearman's rank-order correlation tests and a Kruskal-Wallis test. The qualitative data were used to develop and expand findings and ideas. The results indicated that the PD initiatives adhered closely to technology-related professional development best-practice criteria. Further, results from the online survey, post-workshop surveys and follow-up face-to-face interviews indicated that while the PD initiatives were positively perceived by faculty, they did not have the anticipated impact on faculty use of Moodle. Other variables, such as perceptions of Moodle and institutional issues, had a considerable influence on Moodle use.
The findings of the study further strengthened the idea that the five variables Everett Rogers lists in his Diffusion of Innovations (DOI) model (perceived attributes of an innovation; type of innovation decision; communication channels; nature of the social system; and extent of change agents' promotion efforts) most influence the adoption of an innovation. However, the results also indicated that some of the variables in Rogers' DOI model seem to have more influence than others, particularly the perceived attributes of an innovation. In addition, the findings of the study could serve to inform universities that have Course Management Systems (CMSs), such as Moodle, about how to utilise them most efficiently and effectively. The findings could also help to inform universities about how to help faculty members acquire the skills necessary to incorporate CMSs into curricula and teaching practice. A limitation of this study was the use of a non-randomised sample, which limits generalisation of the findings beyond this particular Japanese context.
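For readers unfamiliar with the rank-based test named in the methods, Spearman's rank-order correlation can be sketched in a few lines of standard-library Python; this is a minimal illustration (average ranks for ties, then Pearson correlation of the ranks), not the SPSS procedure the study actually used.

```python
# Minimal sketch of Spearman's rank-order correlation coefficient.
from statistics import mean

def _ranks(xs):
    # Assign 1-based ranks, averaging ranks across tied values.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    # Pearson correlation computed on the ranks of x and y.
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```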

Relevance:

100.00%

Publisher:

Abstract:

The continuous growth of XML data poses a great concern in the area of XML data management. The need to process large amounts of XML data complicates many applications, such as information retrieval and data integration. One way of simplifying this problem is to break the massive amount of data into smaller groups by applying clustering techniques. However, XML clustering is an intricate task that may involve processing both the structure and the content of XML data in order to identify similar XML data. This research presents four clustering methods: two utilising the structure of XML documents and two utilising both the structure and the content. The two structural clustering methods use different data models: one is based on a path model and the other on a tree model. These methods employ rigid similarity measures which aim to identify corresponding elements between documents with different or similar underlying structure. The two clustering methods that utilise both structural and content information differ in how the structure and content similarities are combined. One calculates the document similarity using a linear weighting combination of structure and content similarities, with the content similarity based on a semantic kernel. The other calculates the distance between documents by a non-linear combination of the structure and content of XML documents, also using a semantic kernel. Empirical analysis shows that the structure-only clustering method based on the tree model is more scalable than the one based on the path model, as the tree similarity measure does not need to visit the parents of an element many times. Experimental results also show that the clustering methods perform better with the inclusion of content information on most test document collections.
To further the research, the structural clustering method based on the tree model is extended and employed in XML transformation. The results from the experiments show that the proposed transformation process is faster than a traditional transformation system that translates and converts the source XML documents sequentially. Also, the schema matching process of XML transformation produces a better matching result in a shorter time.
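The linear weighting combination of structure and content similarities can be sketched as follows. The path-set and term-set Jaccard measures used here are deliberate simplifications chosen for illustration; the thesis's actual measures (tree similarity and a semantic kernel) are more sophisticated.

```python
# Sketch of combining structural and content similarity of XML documents
# with a linear weighting: sim = w * structure + (1 - w) * content.
import xml.etree.ElementTree as ET

def _paths(elem, prefix=""):
    # Yield every root-to-node tag path (a simple structural signature).
    path = prefix + "/" + elem.tag
    yield path
    for child in elem:
        yield from _paths(child, path)

def _terms(elem):
    # Yield the text tokens of the document (a simple content signature).
    for node in elem.iter():
        if node.text:
            yield from node.text.split()

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def combined_similarity(xml1, xml2, w=0.5):
    r1, r2 = ET.fromstring(xml1), ET.fromstring(xml2)
    s_struct = jaccard(_paths(r1), _paths(r2))
    s_content = jaccard(_terms(r1), _terms(r2))
    return w * s_struct + (1 - w) * s_content
```

Varying `w` moves the measure between structure-only clustering (w = 1) and content-only clustering (w = 0), which is the design axis the two combined methods explore.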

Relevance:

100.00%

Publisher:

Abstract:

Background: Commercially available instrumented treadmill systems that provide continuous measures of temporospatial gait parameters have recently become available for clinical gait analysis. This study evaluated the level of agreement between temporospatial gait parameters derived from a new instrumented treadmill, which incorporated a capacitance-based pressure array, and those measured by a conventional instrumented walkway (the criterion standard). Methods: Temporospatial gait parameters were estimated for 39 healthy adults while walking over an instrumented walkway (GAITRite®) and on an instrumented treadmill system (Zebris) at matched speed. Differences in temporospatial parameters derived from the two systems were evaluated using repeated measures ANOVA models. Pearson product-moment correlations were used to investigate relationships between variables measured by each system. Agreement was assessed by calculating the bias and 95% limits of agreement. Results: All temporospatial parameters measured via the instrumented walkway were significantly different from those obtained from the instrumented treadmill (P < .01). Temporospatial parameters derived from the two systems were highly correlated (r, 0.79–0.95). The 95% limits of agreement for temporal parameters were typically less than ±2% of gait cycle duration. However, 95% limits of agreement for spatial measures were as much as ±5 cm. Conclusions: Differences in temporospatial parameters between systems were small but statistically significant, and of similar magnitude to changes reported between shod and unshod gait in healthy young adults. Temporospatial parameters derived from an instrumented treadmill, therefore, are not representative of those obtained from an instrumented walkway and should not be interpreted with reference to literature on overground walking.
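The agreement analysis described (bias and 95% limits of agreement, i.e. a Bland-Altman analysis) can be sketched as below, assuming approximately normally distributed paired differences; the numbers in the usage test are invented for illustration and are not the study's data.

```python
# Sketch of bias and 95% limits of agreement between paired measurements
# from two systems (e.g. instrumented walkway vs. instrumented treadmill).
from statistics import mean, stdev

def limits_of_agreement(system_a, system_b):
    """Return (bias, lower, upper) for paired measurements.

    The 95% limits are bias +/- 1.96 * SD of the differences, which
    assumes the differences are approximately normally distributed.
    """
    diffs = [a - b for a, b in zip(system_a, system_b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)
    return bias, bias - spread, bias + spread
```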

Relevance:

100.00%

Publisher:

Abstract:

Activists, feminists, queer theorists, and those who live outside traditional gender narratives have long challenged the fixity of the sex and gender binaries. While the dominant Western paradigm posits sex and gender as natural and inherent, queer theory argues that sex and gender are socially constructed: our ideas about sex and gender, and the concepts themselves, are shaped by particular social contexts. Questioning the nature of sex can be puzzling. After all, isn’t sex biology? Binary sex – male and female – was labelled as such by scientists based on existing binary categories and observations of hormones, genes, chromosomes, reproductive organs, genitals and other bodily elements. Binary sex is allocated at birth by genital appearance. Not everyone fits into these categories, and this leads queer theorists, and others, to question the categories. Now, “some scientists are also starting to move away from the idea of biology as the fixed basis on which the social artefact of gender is built” (5). Making Girls and Boys: Inside the Science of Sex, by Jane McCredie, examines theories about gender roles and behaviours, also considering those who don’t fit the arbitrary sex and gender binaries.

Relevance:

100.00%

Publisher:

Abstract:

Disproportionate representation of males and females in science courses and careers continues to be of concern. This article explores gender differences in Australian high school students’ perceptions of school science and their intentions to study university science courses. Nearly 3800 15-year-old students responded to a range of 5-point Likert items relating to intentions to study science at university, perceptions of career-related instrumental issues such as remuneration and job security, self-rated science ability and enjoyment of school science. Australian boys and girls reported enjoying science to a similar extent; however, boys reported enjoying it more relative to other subjects than did girls, and rated their ability in science compared to others in their class more highly than did girls. There was no significant difference between the mean responses of girls and boys to the item “It is likely I will choose a science-related university course when I leave school”. The strongest predictors of responses to this item were items relating to students’ liking for school science and awareness from school science of new and exciting jobs, followed by their perceived self-ability. These results are discussed in relation to socio-scientific values that interact with identity and career choices, employment prospects in science, and implications for science education.

Relevance:

100.00%

Publisher:

Abstract:

Research on the achievement and retention of female students in science and mathematics is located within a context of falling levels of participation in physical science and mathematics courses in Australian schools, and underrepresentation of females in some science, technology, engineering and mathematics (STEM) courses. The Interests and Recruitment in Science (IRIS) project is an international project that aims to contribute to understanding and improving recruitment, retention and gender equity in STEM higher education. Nearly 3500 first-year students in 30 Australian universities responded to the IRIS survey of 5-point Likert items and open responses. This paper explores gender differences in first-year university students’ responses to three questions about important influences on their course choice. The IRIS study found that good teachers were rated highly by both males and females as influential in choosing STEM courses, and that significantly more females rated personal encouragement from a senior high school science teacher as very important. In suggestions for addressing sex disparities in male-dominated STEM courses, more females indicated the importance of good teaching and encouragement, and more females mentioned (unspecified) encouragement. This study relates to the influence of school science teachers, and the results are discussed in relation to implications for science education.

Relevance:

100.00%

Publisher:

Abstract:

This paper reports results from a study comparing teachers’ and students’ perceptions of the relative degree of influence that parents, teachers, friends, older students and careers advisors have on students’ decisions about enrolling in non-compulsory high school science subjects. The comparison was carried out as part of the Choosing Science project, a large-scale Australian study of 15-year-old students’ experiences of school science and intentions regarding further participation. The study found that students considered their science teachers to have had the greatest influence, followed by parents and then friends. In contrast, science teachers believed their students to be most influenced in their decisions by friends and peers, followed by older students and siblings, and then parents, with teachers themselves having relatively little influence. Both groups believed that advice from careers advisors was of little influence. The findings are unique in the science education literature in providing an insight into differences and similarities in the perceptions of students and their teachers. In particular, they indicate that teachers play a far greater role in students’ decisions about enrolling in science than teachers themselves believe. This has important implications for science teachers and teacher educators in terms of appreciating their influence and applying it in ways that encourage participation in science courses.

Relevance:

100.00%

Publisher:

Abstract:

Integration of rooftop PVs and increasing peak demand in residential distribution networks have resulted in unacceptable voltage profiles. Curtailing PV generation to alleviate the overvoltage problem and making regular network investments to cater for peak demand are not always feasible. The reactive capability of PV inverters can, to some extent, address both voltage dips and overvoltage problems. This paper proposes an algorithm to utilise the reactive capability of PV inverters and investigates its effectiveness as a function of feeder length and the R/X ratio of the line. The feeder loading level that yields an acceptable voltage profile for a particular R/X ratio is also investigated. Furthermore, the paper explores the feeder distances and R/X ratios needed for an acceptable voltage profile, which can be useful for suburban design and distribution planning.
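The interaction between the feeder's R/X ratio and the inverter's reactive capability rests on the standard approximation ΔV ≈ (R·P + X·Q)/V for a radial feeder. A rough sketch follows, with illustrative parameter values and sign conventions of our choosing; it is not the paper's algorithm.

```python
# Rough sketch of the approximate voltage change at the end of a radial
# LV feeder, dV ~ (R*P + X*Q) / V, and of the inverter reactive power Q
# that would offset the rise caused by exported PV power P.

def voltage_change(p_export, q_inject, r_ohm, x_ohm, v_nominal=230.0):
    """Approximate per-phase voltage change in volts.

    p_export > 0 models reverse power flow from PV export;
    q_inject < 0 models the inverter absorbing reactive power.
    """
    return (r_ohm * p_export + x_ohm * q_inject) / v_nominal

def q_to_cancel_rise(p_export, r_ohm, x_ohm):
    """Reactive power (negative = absorb) that cancels the PV-driven rise.

    The required |Q| scales with the R/X ratio, which is why reactive
    support is less effective on highly resistive (high R/X) feeders.
    """
    return -r_ohm * p_export / x_ohm
```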

Relevance:

100.00%

Publisher:

Abstract:

Enterprises, both public and private, have rapidly begun combining enterprise resource planning (ERP) with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based on software systems hosted in a “cloud computing” environment. “Garbage in, garbage out” (“GIGO”) is a term long used, dating from the 1960s, to describe the problems of unqualified dependency on information systems. A more pertinent variation arose sometime later, namely “garbage in, gospel out”: with large-scale information systems, such as ERP using open datasets in a cloud environment, verifying the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhanced identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
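As a minimal illustration of the data set authenticity problem raised above, the sketch below verifies a downloaded open data set against a digest published through a separate trusted channel. This covers integrity only, a small part of the identity, authenticity and audit services the paper calls for.

```python
# Sketch: detecting tampered or "impersonated" data sets by comparing a
# SHA-256 digest of the downloaded bytes against a digest published
# out-of-band (e.g. on the data custodian's signed catalogue page).
import hashlib
import hmac

def digest(data: bytes) -> str:
    """Hex digest of a data set's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, published_digest: str) -> bool:
    """True if the data matches the published digest.

    hmac.compare_digest performs a timing-safe comparison.
    """
    return hmac.compare_digest(hashlib.sha256(data).hexdigest(),
                               published_digest)
```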

Relevance:

100.00%

Publisher:

Abstract:

An increasing number of countries are faced with an aging population in increasing need of healthcare services. For any e-health information system, building trust among clients who may have little knowledge of the security scheme involved is paramount. In addition, scalability has become a critical aspect of system design, development and ongoing management. Cryptographic systems provide the security provisions needed for confidentiality, authentication, integrity and non-repudiation, but cryptographic key management must be secure, yet efficient and effective in developing an attitude of trust in system users. Digital certificate-based Public Key Infrastructure has long been the technology of choice for information security/assurance; however, there appears to be a notable lack of successful implementations and deployments globally. Moreover, recent issues with Certificate Authority security have damaged trust in these schemes. This paper proposes the adoption of a centralised public key registry structure, a non-certificate-based scheme, for large-scale e-health information systems. The proposed structure removes complex certificate management, revocation and certificate validation structures while maintaining overall system security. Moreover, the registry concept may be easier for both healthcare professionals and patients to understand and trust.

Relevance:

100.00%

Publisher:

Abstract:

Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients and to identify new opportunities. Moreover, these activities are now largely based on software systems hosted in a “cloud computing” environment. The over 50-year-old phrase “garbage in, garbage out” (“GIGO”) describes the problem of unqualified and unquestioning dependency on information systems. A more relevant interpretation arose sometime later, namely “garbage in, gospel out”: with large-scale information systems based around ERP, open datasets and “big data” analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable, unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhanced identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper, and some appropriate technologies currently being offered are examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)

Relevance:

100.00%

Publisher:

Abstract:

Objective: To evaluate the effectiveness and robustness of Anonym, a tool for de-identifying free-text health records based on conditional random field classifiers informed by linguistic and lexical features, as well as features extracted by pattern-matching techniques. De-identification of personal health information in electronic health records is essential for the sharing and secondary usage of clinical data. De-identification tools that adapt to different sources of clinical data are attractive as they would require minimal intervention to guarantee high effectiveness. Methods and Materials: The effectiveness and robustness of Anonym are evaluated across multiple datasets, including the widely adopted Integrating Biology and the Bedside (i2b2) dataset, used for evaluation in a de-identification challenge. The datasets used here vary in the type of health records, the source of the data, and their quality, with one of the datasets containing optical character recognition errors. Results: Anonym identifies and removes up to 96.6% of personal health identifiers (recall) with a precision of up to 98.2% on the i2b2 dataset, outperforming the best system proposed in the i2b2 challenge. The effectiveness of Anonym across datasets is found to depend on the amount of information available for training. Conclusion: The findings show that Anonym compares favourably with the best approach from the 2006 i2b2 shared task. It is easy to retrain Anonym with new datasets; if retrained, the system is robust to variations in training size, data type and quality, in the presence of sufficient training data.
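The evaluation measures quoted above (recall and precision over personal health identifiers) can be sketched as follows; the token sets in the usage test are invented for illustration and are not taken from the i2b2 data.

```python
# Sketch of precision and recall for a de-identification system: the
# identifiers the system flags are compared against a gold-standard
# annotation of the same record.

def precision_recall(predicted, gold):
    """Return (precision, recall) over sets of flagged identifiers.

    precision = TP / (TP + FP): how many flagged items were real PHI.
    recall    = TP / (TP + FN): how much real PHI was found.
    """
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 1.0
    recall = tp / len(gold) if gold else 1.0
    return precision, recall
```

High recall is the priority for de-identification (missed identifiers leak personal information), while precision measures how much non-identifying clinical text is needlessly removed.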

Relevance:

100.00%

Publisher:

Abstract:

Effective machine fault prognostic technologies can eliminate unscheduled downtime, increase machine useful life and consequently reduce maintenance costs, as well as prevent human casualties, in real engineering asset management. This paper presents a technique for accurate assessment of the remnant life of machines based on a health state probability estimation technique and historical failure knowledge embedded in a closed-loop diagnostic and prognostic system. To estimate a discrete machine degradation state that can effectively represent the complex nature of machine degradation, the proposed prognostic model employs a classification algorithm which can use a number of damage-sensitive features, in contrast to conventional time-series analysis techniques, for accurate long-term prediction. To validate the feasibility of the proposed model, data at five different severity levels for four typical faults from High Pressure Liquefied Natural Gas (HP-LNG) pumps were used to compare intelligent diagnostic tests using five different classification algorithms. In addition, two sets of impeller-rub data were analysed and employed to predict the remnant life of the pump based on health state probability estimation using a Support Vector Machine (SVM) classifier. The results obtained were very encouraging and showed that the proposed prognostic system has the potential to be used as an estimation tool for machine remnant life prediction in real-life industrial applications.
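The health-state-probability approach to remnant life can be sketched as an expectation over discrete degradation states; the states, probabilities and per-state remaining hours below are invented for illustration and are not from the HP-LNG pump data or the paper's model.

```python
# Sketch: expected remaining useful life as a probability-weighted sum
# over discrete health states. A classifier (e.g. an SVM) would supply
# P(state | current measurements); each state carries the remaining
# hours historically observed from that state until failure.

def remnant_life(state_probs, remaining_hours):
    """Expected remaining life given per-state probabilities and the
    remaining hours associated with each state (ordered from healthy
    to near-failure)."""
    assert abs(sum(state_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * h for p, h in zip(state_probs, remaining_hours))
```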