319 results for Common Fixed Point


Relevance: 20.00%

Publisher:

Abstract:

Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seems self-evident and inevitable after the event. For the artist/creator/inventor/designer stuck at the point just before the creative leap, however, the next step is anything but obvious. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity; it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques: by the time the problem has been defined, it has been solved. Indeed, the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question arises: can we find some way of searching the space ahead? Of course, there are serious problems of knowing what we are looking for and of the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process. Conceptual analogies such as search, search spaces and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misguided, and they thereby further confound the issue. The term "search" becomes a misnomer, since it carries the connotation that it is possible to find what you are looking for; in such vast spaces the term must be discarded. Thus, any attempt to search for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious.
Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless, we are left with a tantalizing possibility: if a creative idea seems inevitable after the event, might the process somehow be reversed? This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). Nevertheless, Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as evolutionary algorithms, which are usually thought of as search algorithms. It is necessary to abandon such connections with searching and to see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments.
Most importantly, nature has all the time in the world. As designers we cannot afford prototyping and ruthless experiment, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
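The nature analogy above can be sketched as a minimal evolutionary loop: generate variant designs, keep the better experiments, and repeat, with no claim that the survivor is *the* optimum. The genome encoding, fitness function, and parameters below are illustrative assumptions, not part of the original text.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=30, seed=1):
    """Minimal generational loop: mutate survivors, keep the better experiments."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # Rank the current "experiments"; the better half survives.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Each survivor spawns a mutated variant -- a fresh experiment,
        # not a step in a search toward a known target.
        children = [[g + rng.gauss(0, 0.1) for g in p] for p in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Illustrative "design brief": prefer parameter vectors summing close to 6.
best = evolve(lambda g: -abs(sum(g) - 6.0))
```

Nothing guarantees that `best` is an optimum; the loop merely accumulates surviving experiments, which is the sense in which the text asks us to read evolutionary algorithms.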

Abstract:

‘Adolescence’ has become increasingly recognised as a nebulous concept. Previous conceptualisations of adolescence have adopted a ‘deficit’ view, regarding teenagers as ‘unfinished’ adults. This deficit view is highly problematic in an era where adulthood itself is difficult to define. The terms ‘kidult’ and ‘adultescent’ have emerged to describe adult-age people whose interests and priorities match those of their teenage counterparts. Rather than relying on ‘lock-step’ models of physical, cognitive and social growth put forward by developmental psychology, adolescence can be more usefully defined by looking at the common experiences of people in their teenage years. Common experiences arise at an institutional level; for example, all adolescents are treated the same by legal and education systems. The transition from primary to secondary schooling is a milestone for all children, exposing them to a new type of educational environment. Shared experiences also arise from generational factors. Today’s adolescents belong to the millennial generation, characterised by technological competence, global perspectives, high susceptibility to media influence, individualisation and rapid interactions. This generation focuses on teamwork, achievement, modesty and good conduct, and has great potential for significant collective accomplishments. These generational factors challenge educators to provide relevant learning experiences for today’s students. Many classrooms still use textbook-based pedagogy more suited to previous generations, resulting in disengagement among millennial students. Curriculum content must also be tailored to generational needs. The rapid pace of change, as well as the fluidity of identity created by dissolving geographical and vocational boundaries, means that the millennial generation will need more than a fixed set of skills and knowledge to enter adulthood.
Teachers must enable their students to think like ‘expert novices’, adept at assimilating new concepts in depth and prepared to engage in lifelong learning.

Abstract:

Health Information Systems (HIS) make extensive use of Information and Communication Technologies (ICT). The use of ICT aids in improving the quality and efficiency of healthcare services by making healthcare information available at the point of care (Goldstein, Groen, Ponkshe, and Wine, 2007). The increasing availability of healthcare data presents security and privacy issues which have not yet been fully addressed (Liu, Caelli, May, and Croll, 2008a). Healthcare organisations have to comply with the security and privacy requirements stated in laws, regulations and ethical standards, while managing healthcare information. Protecting the security and privacy of healthcare information is a very complex task (Liu, May, Caelli and Croll, 2008b). In order to simplify the complexity of providing security and privacy in HIS, appropriate information security services and mechanisms have to be implemented. Solutions at the application layer have already been implemented in HIS such as those existing in healthcare web services (Weaver et al., 2003). In addition, Discretionary Access Control (DAC) is the most commonly implemented access control model to restrict access to resources at the OS layer (Liu, Caelli, May, Croll and Henricksen, 2007a). Nevertheless, the combination of application security mechanisms and DAC at the OS layer has been stated to be insufficient in satisfying security requirements in computer systems (Loscocco et al., 1998). This thesis investigates the feasibility of implementing Security Enhanced Linux (SELinux) to enforce a Role-Based Access Control (RBAC) policy to help protect resources at the Operating System (OS) layer. SELinux provides Mandatory Access Control (MAC) mechanisms at the OS layer. These mechanisms can contain the damage from compromised applications and restrict access to resources according to the security policy implemented. 
The main contribution of this research is to provide a modern framework to implement and manage SELinux in HIS. The proposed framework introduces SELinux Profiles, which restrict access permissions over system resources to authorised users. The feasibility of using SELinux Profiles in HIS was demonstrated through a prototype, which was tested against various attack scenarios based on vulnerabilities common at the application layer. The prototype was also tested during emergency scenarios, where changes to the security policies had to be made on the spot. SELinux demonstrated that it could effectively contain attacks at the application layer and provide adequate flexibility during emergency situations. However, even with current tools, the development of SELinux policies can be very complex. Further research is needed to simplify the management of SELinux policies and access permissions. In addition, SELinux related technologies, such as the Policy Management Server by Tresys Technologies, need to be researched in order to provide solutions at different layers of protection.
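The core idea, a mandatory, default-deny policy in which roles are granted access only to explicitly labelled resources, can be sketched in a few lines. The role names, type labels, and permissions below are hypothetical illustrations, not taken from the thesis or from any real SELinux policy.

```python
# Toy default-deny access model in the spirit of SELinux type enforcement
# combined with RBAC. All role names, type labels, and permissions here are
# hypothetical illustrations.
POLICY = {
    "nurse_r":  {("patient_record_t", "read")},
    "doctor_r": {("patient_record_t", "read"), ("patient_record_t", "write")},
}

def allowed(role, resource_type, permission):
    """Mandatory-access-control stance: deny unless the policy explicitly allows."""
    return (resource_type, permission) in POLICY.get(role, set())
```

This mirrors how a compromised application confined to an unprivileged role cannot reach resources outside its profile: a role absent from the policy is simply denied everything.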

Abstract:

Cultural policy settings attempting to foster the growth and development of the Australian feature film industry in an era of globalisation are coming under increasing pressure. Global forces and emerging production and distribution models are challenging the “narrowness” of cultural policy, which mandates a particular film culture, circumscribes certain notions of value and limits the variety of films produced through cultural-policy-driven subvention models. Australian horror film production is an instructive case study. Horror films are a production strategy well suited to the financial limitations of the Australian film industry, offering producers competitive advantages against international competitors. However, emerging within a “national” cinema driven by public subsidy and social/cultural objectives, horror films, internationally oriented and carrying a low-culture status, have been severely marginalised within public funding environments. This paper introduces Australian horror film production, examines the limitations of cultural policy, and considers the implications of these questions for the Producer Offset.

Abstract:

The determination of the most appropriate procurement method for capital works projects is a challenging task for the Department of Housing and Works (DHW) and other Western Australian State Government Agencies because of the array of assessment criteria to be considered and the range of procurement methods available. A number of different procurement systems can be used to deliver capital works projects, such as traditional, design and construct, and management. Sub-classifications of these systems have proliferated and continue to emerge in response to market demands. The selection of an inappropriate procurement method may lead to undesirable project outcomes. To assist DHW in selecting an appropriate procurement method for its capital works projects, a six-step procurement method selection process is presented. The characteristics of the most common forms of procurement method used in Australia are described, and case studies where these methods have been applied to specific types of capital works in Western Australia are offered as a reference point and learning opportunity for procurement method selection.
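One common way to operationalise a multi-criteria selection step of this kind is a weighted scoring matrix. The criteria, weights, and scores below are purely illustrative assumptions and are not DHW's actual six-step process or its criteria.

```python
# Hypothetical weighted-criteria scoring for procurement method selection.
# Criteria, weights (summing to 1), and 1-5 scores are illustrative only.
WEIGHTS = {"speed": 0.3, "cost_certainty": 0.4, "flexibility": 0.3}

METHODS = {
    "traditional":          {"speed": 2, "cost_certainty": 4, "flexibility": 2},
    "design_and_construct": {"speed": 4, "cost_certainty": 3, "flexibility": 2},
    "management":           {"speed": 4, "cost_certainty": 2, "flexibility": 5},
}

def rank_methods(weights, methods):
    """Return (method, weighted score) pairs, best first."""
    scores = {m: sum(weights[c] * s for c, s in crit.items())
              for m, crit in methods.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With these made-up numbers the ranking would favour a management form; in practice the weights themselves would come from the project's assessment criteria, which is where a structured selection process earns its keep.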

Abstract:

The project has further developed two programs for the industry partners related to service life prediction and salt deposition. The program for the Queensland Department of Main Roads, which predicts salt deposition on different bridge structures at any point in Queensland, has been further refined by considering more variables. It was found that the height of the bridge significantly affects salt deposition levels only very close to the coast; however, the effect of natural cleaning of salt by rainfall was incorporated into the program. The user interface allows selection of a location in Queensland, followed by a bridge component; the program then predicts the annual salt deposition rate and rates the likely severity of the environment. The service life prediction program for the Queensland Department of Public Works has been expanded to include 10 common building components in a variety of environments. Data mining procedures were used to develop the program and increase the usefulness of the application, and a Query Based Learning System (QBLS) was developed, based on a data-centric model with extensions to provide support for user interaction. The program draws on a number of sources of information about the service life of building components: a Delphi survey, the CSIRO Holistic model and a school survey. During the project, the Holistic model was modified for each building component and databases were generated for the locations of all Queensland schools. Experiments were carried out to verify and provide parameters for the modelling, including instrumentation of a downpipe, measurements of pH and chloride levels in leaf litter, EIS measurements, chromate leaching from Colorbond materials, and dose tests to measure corrosion rates of new materials. A further database was generated for inclusion in the program through a large school survey.
Over 30 schools in a range of environments, from tropical coastal to temperate inland, were visited, and the condition of the building components was rated on a scale of 0-5. The data were analysed and used to calculate an average service life for each component/material combination in each environment where sufficient examples were available.
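By way of illustration only, a survey-based service life estimate might look like the sketch below, which assumes linear degradation of the 0-5 condition rating (taking 5 as "as new" and 0 as "failed") and averages the extrapolated lives. This simple model is my assumption; the project itself drew on much richer sources (the Delphi survey and the CSIRO Holistic model).

```python
# Hypothetical reconstruction of the survey analysis: extrapolate a service
# life from a component's age and 0-5 condition rating (5 = as new,
# 0 = failed), assuming linear degradation, then average over examples of
# the same component/material/environment combination.
def estimated_life(age_years, rating):
    degraded = 5 - rating            # condition lost so far
    if degraded == 0:
        return None                  # no degradation yet: cannot extrapolate
    return age_years * 5 / degraded  # years until the rating would reach 0

def average_life(observations):
    """observations: list of (age_years, rating) for one combination."""
    lives = [estimated_life(a, r) for a, r in observations]
    lives = [x for x in lives if x is not None]
    return sum(lives) / len(lives) if lives else None
```

Note how a combination with only pristine examples yields no estimate at all, which is one reason the survey could only produce averages "where sufficient examples were available".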

Abstract:

This paper provides a fresh analysis of the widely used Common Scrambling Algorithm Stream Cipher (CSA-SC). Firstly, a new representation of CSA-SC with a state size of only 89 bits is given, a significant reduction from the 103-bit state of a previous CSA-SC representation. Analysis of this 89-bit representation demonstrates that the basis of a previous guess-and-determine attack is flawed: correcting the flaw increases the complexity of that attack beyond exhaustive key search. Although that attack is therefore not feasible, the reduced state size of our representation makes it clear that CSA-SC is vulnerable to several generic attacks, for which feasible parameters are given.
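The paper does not specify here which generic attacks it mounts, so purely as an illustration of why a smaller state matters: a Babbage-Golić style time-memory trade-off against a keystream generator with an n-bit state costs on the order of 2^(n/2), and with the 89-bit representation that falls well below exhaustive search of the 64-bit DVB-CSA common key. The figures below are back-of-envelope, not the paper's parameters.

```python
# Birthday-bound cost of a Babbage-Golic style time-memory trade-off on a
# keystream generator: store about 2^(n/2) internal states, observe about
# 2^(n/2) keystream windows, and expect a collision revealing a state.
def tradeoff_cost_log2(state_bits):
    return state_bits / 2

STATE_BITS = 89  # the paper's reduced CSA-SC representation
KEY_BITS = 64    # length of the DVB-CSA common key

# Roughly 2^44.5 work against the 89-bit state, versus 2^64 key search.
beats_exhaustive_search = tradeoff_cost_log2(STATE_BITS) < KEY_BITS
```

Against the earlier 103-bit representation the same trade-off would cost about 2^51.5, which is why shrinking the representation to 89 bits makes generic attacks so much more pressing.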

Abstract:

The common brown leafhopper, Orosius orientalis (Matsumura) (Homoptera: Cicadellidae), previously described as Orosius argentatus (Evans), is an important vector of several viruses and phytoplasmas worldwide. In Australia, phytoplasmas vectored by O. orientalis cause a range of economically important diseases, including legume little leaf (Hutton & Grylls, 1956), tomato big bud (Osmelak, 1986), lucerne witches' broom (Helson, 1951), potato purple top wilt (Harding & Teakle, 1985), and Australian lucerne yellows (Pilkington et al., 2004). Orosius orientalis also transmits Tobacco yellow dwarf virus (TYDV; genus Mastrevirus, family Geminiviridae) to beans, causing bean summer death disease (Ballantyne, 1968), and to tobacco, causing tobacco yellow dwarf disease (Hill, 1937, 1941). To date, TYDV has been recorded only in Australia. Both diseases result in significant production and quality losses (Ballantyne, 1968; Thomas, 1979; Moran & Rodoni, 1999). Although direct damage caused by leafhopper feeding has been observed, it is relatively minor compared to the losses resulting from disease (P. Trębicki, unpubl.).

Abstract:

A simple mimetic of a heparan sulfate disaccharide sequence that binds to the growth factors FGF-1 and FGF-2 was synthesized by coupling a 2-azido-2-deoxy-D-glucosyl trichloroacetimidate donor with a 1,6-anhydro-2-azido-2-deoxy-β-D-glucose acceptor. Both the donor and the acceptor were obtained from a common intermediate readily prepared from D-glucal. Molecular docking calculations showed that the predicted locations of the disaccharide sulfo groups in the binding sites of FGF-1 and FGF-2 are similar to the positions observed for co-crystallized heparin-derived oligosaccharides in published crystal structures.

Abstract:

This paper describes a research project that examines the implications of multidisciplinary student cohorts for teaching and learning within undergraduate and postgraduate units in higher education. Whilst students generally specialise in one discipline, it is also common that, at some point during their degree, they will choose to undertake subjects outside their specialist area. Students may choose a multidisciplinary learning experience either out of interest or because the subject is seen as complementary to their core discipline. When the lens of identity is applied to the multidisciplinary cohorts in undergraduate and postgraduate units, it assists in identifying learning needs. The nature of disciplinarity, and the impact it has on students’ academic identity, presents challenges to both students and teachers when they engage in teaching and learning, affecting curriculum design, assessment practices and teaching delivery strategies (Winberg, 2008). This project aims to identify the barriers to effective teaching and learning in units with multidisciplinary student cohorts, identify the particular needs of students in such cohorts, and determine a teaching and learning model that meets those needs.

References
Becher, T. & Trowler, P.R. (2001). Academic tribes and territories: Intellectual enquiry and the culture of the discipline. Buckingham, UK: Open University Press.
Light, G. & Cox, R. (2001). Learning and teaching in higher education: A reflective professional. Thousand Oaks, CA: Sage.
Neumann, R. (2001). Disciplinary differences and university teaching. Studies in Higher Education, 26(2), 135-146.
Neumann, R., Parry, S. & Becher, T. (2002). Teaching and learning in their disciplinary contexts: A conceptual analysis. Studies in Higher Education, 27(4), 405-417.
Taylor, P.G. (1999). Making sense of academic life: Academics, universities and change. Buckingham, UK: Open University Press.
Winberg, C. (2008). Teaching engineering/engineering teaching: Interdisciplinary collaboration and the construction of academic identities. Teaching in Higher Education, 13(3), 353-367.

Abstract:

Monitoring unused or dark IP addresses offers opportunities to extract useful information about both on-going and new attack patterns. In recent years, different techniques have been used to analyze such traffic, including sequential analysis, where a change in traffic behavior, for example a change in mean, is used as an indication of malicious activity. Change points themselves, however, say little about the detected change; further data processing is necessary to extract useful information and to identify the exact cause of the change, which is difficult given the size and nature of the observed traffic. In this paper, we address the problem of analyzing a large volume of such traffic by correlating change points identified in different traffic parameters. The significance of the proposed technique is twofold. Firstly, it automatically extracts information related to change points by correlating those detected across multiple traffic parameters. Secondly, it validates a detected change point through the simultaneous presence of another change point in a different parameter. Using a real network trace collected from unused IP addresses, we demonstrate that the proposed technique enables us not only to validate the change point but also to extract useful information about its causes.
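The correlation idea can be sketched simply: detect mean shifts per parameter, then treat a change point as validated when another parameter also changes within a small time tolerance. The sliding-window detector, the parameter names, and the synthetic traces below are illustrative assumptions, not the paper's actual method or data.

```python
def change_points(series, window=5, threshold=3.0):
    """Flag index i when the mean of the next `window` samples differs from
    the mean of the previous `window` samples by more than `threshold`."""
    cps = []
    for i in range(window, len(series) - window):
        before = sum(series[i - window:i]) / window
        after = sum(series[i:i + window]) / window
        if abs(after - before) > threshold:
            cps.append(i)
    return cps

def correlate(cps_a, cps_b, tolerance=2):
    """Keep change points in one parameter that are validated by a change
    point in another parameter within `tolerance` samples."""
    return [a for a in cps_a if any(abs(a - b) <= tolerance for b in cps_b)]

# Illustrative traces: packet rate and unique-source count both jump near t=20.
pkt_rate = [10] * 20 + [50] * 20
src_count = [3] * 21 + [40] * 19
validated = correlate(change_points(pkt_rate), change_points(src_count))
```

A change point that appears in only one parameter would be dropped here, which is precisely the validation role the paper assigns to the second parameter.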

Abstract:

Purpose: To assess the repeatability and validity of lens densitometry derived from the Pentacam Scheimpflug imaging system.
Setting: Eye Clinic, Queensland University of Technology, Brisbane, Australia.
Methods: This prospective cross-sectional study evaluated 1 eye of subjects with or without cataract. Scheimpflug measurements and slitlamp and retroillumination photographs were taken through a dilated pupil. Lenses were graded with the Lens Opacities Classification System III. Intraobserver and interobserver reliability of 3 observers, each performing 3 repeated Scheimpflug lens densitometry measurements, was assessed. Three lens densitometry metrics were evaluated: linear, for which a line was drawn through the visual axis and a mean lens densitometry value given; peak, the point at which lens densitometry is greatest on the densitogram; and 3-dimensional (3D), in which a fixed, circular 3.0 mm area of the lens is selected and a mean lens densitometry value given. Bland and Altman analysis of repeatability for multiple measures was applied; results were reported as the repeatability coefficient and relative repeatability (RR).
Results: Twenty eyes were evaluated. Repeatability was high. Overall, interobserver repeatability was marginally lower than intraobserver repeatability. The peak was the least reliable metric (RR 37.31%) and 3D the most reliable (RR 5.88%). Intraobserver and interobserver lens densitometry values in the cataract group were slightly less repeatable than in the noncataract group.
Conclusion: The intraobserver and interobserver repeatability of Scheimpflug lens densitometry was high in eyes with and without cataract, which supports the use of automated lens density scoring with the Scheimpflug system evaluated in the study.
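Repeatability statistics of the kind reported above can be sketched as follows. A standard Bland-Altman formulation takes the repeatability coefficient as 1.96·√2 times the within-subject standard deviation, with relative repeatability expressed as a percentage of the mean; the data layout (a list of repeated readings per eye) is my assumption about how such data would be arranged, not the study's actual analysis code.

```python
import math

def repeatability(per_subject_readings):
    """Bland-Altman style repeatability for repeated measures.

    per_subject_readings: one list of repeated readings per subject/eye.
    Returns (repeatability coefficient, relative repeatability in %)."""
    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    # Within-subject SD: square root of the mean per-subject sample variance.
    sw = math.sqrt(sum(sample_var(xs) for xs in per_subject_readings)
                   / len(per_subject_readings))
    rc = 1.96 * math.sqrt(2) * sw  # repeatability coefficient (~2.77 * Sw)
    grand_mean = (sum(sum(xs) for xs in per_subject_readings)
                  / sum(len(xs) for xs in per_subject_readings))
    rr = 100 * rc / grand_mean     # relative repeatability, %
    return rc, rr
```

A lower RR means a more reliable metric, which is how the 3D metric (RR 5.88%) outperforms the peak metric (RR 37.31%) in the results above.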

Abstract:

In this paper, a fixed-switching-frequency closed-loop modulation of a voltage-source inverter (VSI), upon digital implementation of the modulation process, is analyzed and characterized. The sampling frequency of the digital processor is considered an integer multiple of the modulation switching frequency. An expression for the modulation design parameter is developed for smooth modulation at a fixed switching frequency. The variation of the sampling frequency, switching frequency, and modulation index is analyzed to determine the switching condition under closed loop. It is shown that the switching condition determined from the continuous-time analysis of the closed-loop modulation will ensure smooth modulation upon digital implementation of the modulation process. However, the stability properties need to be tested prior to digital implementation, as they deteriorate at smaller sampling frequencies. The maximum closed-loop modulation index should be considered when determining the design parameters for smooth modulation. In particular, a detailed analysis has been carried out by varying the control gain in the sliding-mode control of a two-level VSI. The proposed analysis of the closed-loop modulation of the VSI has been verified for the operation of a distribution static compensator, and the theoretical results are validated experimentally on both single- and three-phase systems.