965 results for formal methods


Relevance:

30.00%

Publisher:

Abstract:

The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were estimated using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors (sketched after this abstract). Peer support emerged as the best predictor of SMC usage. Other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytics from individual attitudes and behaviours to shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed.
Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition: the Web 2.0 learning initiative was useful in principle but useless in practice. While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that hindered the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) being sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others. These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic for these individuals is then ‘both-and’ rather than ‘either-or’: they have the capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school, and its complex relationship to students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
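
Below is a minimal, illustrative sketch of the incremental CART modelling described above, assuming the questionnaire responses sit in a CSV file; the file name, column names and usage threshold are hypothetical, not taken from the thesis.

```python
# Minimal sketch of incremental CART models; names and threshold are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

survey = pd.read_csv("smc_survey.csv")            # hypothetical questionnaire data
y = (survey["smc_usage"] > 0).astype(int)         # any reported SMC use vs. none

blocks = {
    "individual": ["learning_goals", "performance_goals",
                   "cognitive_playfulness", "personal_innovativeness"],
    "social": ["peer_support"],
    "technological": ["perceived_usefulness", "perceived_ease_of_use"],
}

# Fit three nested models: individual; individual + social; individual + social + technological.
predictors = []
for name in ["individual", "social", "technological"]:
    predictors += blocks[name]
    tree = DecisionTreeClassifier(max_depth=3, random_state=0)
    score = cross_val_score(tree, survey[predictors], y, cv=5).mean()
    print(f"up to '{name}' block: {len(predictors)} predictors, CV accuracy = {score:.2f}")
```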

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a unified and systematic assessment of ten position control strategies for a hydraulic servo system with a single-ended cylinder driven by a proportional directional control valve. We aim to identify those methods that achieve better tracking, have low sensitivity to system uncertainties, and offer a good balance between development effort and end results. A formal approach to this problem, introduced herein, relies on several practical metrics. Their choice is important, as the comparison results between controllers can vary significantly depending on the selected criterion. Apart from the quantitative assessment, we also raise aspects that are difficult to quantify but must be kept in mind when considering the position control problem for this class of hydraulic servo systems.
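
A hedged sketch of how such a metric-based comparison might be set up for one controller's step response; the specific metrics (IAE, ISE, 2% settling time) are illustrative assumptions rather than the metrics defined in the paper.

```python
# Illustrative tracking-error metrics for one controller's recorded step response.
import numpy as np

def tracking_metrics(t, reference, position, band=0.02):
    """Simple tracking metrics on a uniform time grid."""
    dt = t[1] - t[0]
    error = reference - position
    iae = np.sum(np.abs(error)) * dt          # integral of absolute error
    ise = np.sum(error ** 2) * dt             # integral of squared error
    final = reference[-1]
    tol = band * abs(final)
    outside = np.where(np.abs(position - final) > tol)[0]
    settling = t[outside[-1]] if outside.size else t[0]   # last time outside the +/-2% band
    return {"IAE": iae, "ISE": ise, "settling_time": settling}

# Synthetic first-order response to a 10 mm step, standing in for measured data.
t = np.linspace(0.0, 2.0, 2001)
ref = np.full_like(t, 10.0)
pos = 10.0 * (1.0 - np.exp(-5.0 * t))
print(tracking_metrics(t, ref, pos))
```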

Relevance:

30.00%

Publisher:

Abstract:

Bana et al. proposed the formal indistinguishability relation (FIR), i.e. an equivalence between two terms built from an abstract algebra. Later, Ene et al. extended it to cover active adversaries and random oracles. This notion provides a framework for verifying computational indistinguishability while still offering the simplicity and formality of symbolic methods. We are in the process of building an automated tool for checking FIR between two terms. First, we extend the work by Ene et al. further, by covering ordered sorts and simplifying the way random oracles are handled. Second, we investigate the possibility of combining algebras, since this makes the tool scalable and able to cover a wide class of cryptographic schemes. Specifically, we show that the combined algebra is still computationally sound, as long as each component algebra is sound. Third, we design some proving strategies and implement the tool. Essentially, the strategies allow us to find a sequence of intermediate terms, each formally indistinguishable from the next, between two given terms; FIR between the two given terms is then guaranteed by the transitivity of FIR. Finally, we show applications of the work, e.g. to key exchanges and encryption schemes. In the future, the tool should be easily extensible to cover many more schemes. This work continues our previous research on the use of compilers to aid automated proofs for key exchange.
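
A minimal sketch of the chain-search idea behind the proving strategies: find intermediate terms linked by indistinguishability-preserving rewrites, so that transitivity of FIR relates the two endpoints. Terms here are opaque strings and the single rewrite rule is a toy stand-in, not the tool's actual term algebra.

```python
# Toy breadth-first search for a chain of intermediate terms between two terms.
from collections import deque

def find_indistinguishability_chain(start, goal, rewrite_rules, max_steps=10):
    """Return a list of terms from start to goal, each step justified by one rule.

    Each rule maps a term to a set of terms assumed formally indistinguishable
    from it; transitivity of FIR then relates start and goal via the chain.
    """
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        chain = frontier.popleft()
        if chain[-1] == goal:
            return chain
        if len(chain) > max_steps:
            continue
        for rule in rewrite_rules:
            for nxt in rule(chain[-1]):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(chain + [nxt])
    return None

# Toy rule: a one-time-pad expression may be replaced by fresh randomness.
otp_rule = lambda t: {t.replace("xor(k, m)", "r")} if "xor(k, m)" in t else set()
print(find_indistinguishability_chain("pair(xor(k, m), n)", "pair(r, n)", [otp_rule]))
```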

Relevance:

30.00%

Publisher:

Abstract:

In 2009 the Australian Federal and State governments are expected to have spent some AU$30 billion procuring infrastructure projects. For governments with finite resources but many competing projects, formal capital rationing is achieved through the use of business cases. These business cases articulate the merits of investing in particular projects along with the estimated costs and risks of each project. Despite the sheer size and impact of infrastructure projects, there is very little research in Australia, or internationally, on the performance of these projects against the business case assumptions made when the decision to invest is taken. If such assumptions (particularly cost assumptions) are not met, then there is serious potential for the misallocation of Australia’s finite financial resources. This research addresses this important gap in the literature by using combined quantitative and qualitative research methods to examine the actual performance of 14 major Australian government infrastructure projects. The research findings are controversial, as they challenge widely held perceptions of the effectiveness of certain infrastructure delivery practices. Despite this controversy, the research has had a significant impact on the field and has been described as ‘outstanding’ and ‘definitive’ (Alliancing Association of Australasia), "one of the first of its kind" (Infrastructure Partnerships of Australia) and "making a critical difference to infrastructure procurement" (Victorian Department of Treasury). The implications for practice have been profound, including the withdrawal by government of various infrastructure procurement guidelines, the formulation of new infrastructure policies by several state governments, and the preparation of new infrastructure guidelines that substantially reflect the research findings. Building on the practical research, a more rigorous academic investigation focussed on the comparative cost uplift of various project delivery strategies was submitted to Australia’s premier academic management conference, the Australian and New Zealand Academy of Management (ANZAM) Annual Conference. The paper was accepted for the 2010 ANZAM National Conference following double-blind peer review, with reviewers rating its overall contribution as "Excellent" and "Good".

Relevance:

30.00%

Publisher:

Abstract:

Airports and cities inevitably recognise the value that each brings the other; however, the separation in decision-making authority over what to build, where, when and how presents a conundrum for both parties. Airports often want a say in what is developed outside the airport fence, and cities often want a say in what is developed inside the airport fence. Defining how much of a say airports and cities have in decisions beyond their jurisdictional control is likely to remain a live topic so long as airports and cities maintain separate formal decision-making processes for what to build, where, when and how. However, the recent Green and White Papers for a new National Aviation Policy have made early inroads into formalising relationships between Australia’s major airports and their host cities. At present, there is no clear indication (in either practice or the literature) of the appropriateness of different governance arrangements for development decisions that bring together the opposing strategic interests of airports and cities; infrastructure development decisions are thus left as complex decision-making spaces that hold airport and city/regional interests at stake. The line of enquiry is motivated by a lack of empirical research on networked decision-making domains outside the realm of institutional theorists (Agranoff & McGuire, 2001; Provan, Fish & Sydow, 2007). That is, governance literature has remained focused on abstract conceptualisations of organisation, without attending to the minutiae of how organisation influences action in real-world applications. A recent study by Black (2008) has provided an initial foothold for governance researchers into networked decision-making domains. This study builds upon Black’s (2008) work by aiming to explore and understand the problem space of making decisions subject to complex jurisdictional and relational interdependencies. That is, the research examines the formal and informal structures, relationships, and forums that operationalise debates and interactions between decision-making actors as they vie for influence over deciding what to build, where, when and how in airport-proximal development projects. The research mobilises a mixture of qualitative and quantitative methods to examine three embedded cases of airport-proximal development from a network governance perspective. Findings from the research provide a new understanding of the ways in which informal actor networks underpin and combine with formal decision-making networks to create new (or realigned) governance spaces that facilitate decision-making during complex phases of development planning. The research is timely, and responds well to Isett, Mergel, LeRoux, Mischen and Rethemeyer’s (2011) recent critique of limitations within current network governance literature, specifically their noted absence of empirical studies that acknowledge and interrogate the simultaneity of formal and informal network structures within network governance arrangements (Isett et al., 2011, pp. 162-166). The combination of social network analysis (SNA) techniques and thematic enquiry has enabled the findings to document and interpret the ways in which decision-making actors organise to overcome complex problems in planning infrastructure.
An innovative approach to using association networks provides insights into the importance of the different ways actors interact with one another, thus offering a simple yet valuable addition to the increasingly popular discipline of SNA. The research also identifies when and how different types of networks (i.e. formal and informal) are able to overcome currently known limitations of network governance (see McGuire & Agranoff, 2011), thus adding depth to the emerging body of network governance literature on the limitations of network ways of working (i.e. Rhodes, 1997a; Keast & Brown, 2002; Rethemeyer & Hatmaker, 2008; McGuire & Agranoff, 2011). Contributions are made to practice through a timely understanding of how horizontal fora between airports and their regions are used, particularly in the context of how they reframe the governance of decision-making for airport-proximal infrastructure development. This new understanding will enable government and industry actors to better understand the structural impacts of governance arrangements before they design or adopt them, particularly with respect to factors such as efficiency of information, oversight, and responsiveness to change.
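
A hedged sketch of the association-network idea mentioned above, using networkx: actors are linked when they co-participate in a decision forum, and centrality scores hint at brokerage positions. The actor and forum names are invented for illustration and are not the study's data.

```python
# Build an association network from shared forum membership (invented toy data).
import networkx as nx
from itertools import combinations

forums = {
    "master_plan_review": ["airport_corp", "state_planning", "city_council"],
    "ground_transport_group": ["airport_corp", "city_council", "transit_agency"],
    "project_steering_cttee": ["airport_corp", "state_planning", "developer"],
}

G = nx.Graph()
for forum, actors in forums.items():
    for a, b in combinations(actors, 2):
        # Edge weight counts how many forums the pair of actors share.
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

# Degree and betweenness centrality as simple indicators of brokerage positions.
print(nx.degree_centrality(G))
print(nx.betweenness_centrality(G))
```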

Relevance:

30.00%

Publisher:

Abstract:

Parametric and generative modelling methods are ways of making computer models more flexible and of formalising domain-specific knowledge. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the basis for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then proceeds to describe an open standard in which the interchange of components could be implemented. As an illustrative example of generative design, Frazer’s ‘Reptiles’ project from 1968 is reinterpreted.
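
As a rough illustration of what a parametric component carries, the sketch below defines a trivially parametric wall whose derived quantities update when a parameter changes; the class and attribute names are illustrative and are not part of the IFC schema.

```python
# Minimal sketch of a parametric component: named parameters, a constraint,
# and geometry derived from the parameters rather than baked in.
from dataclasses import dataclass

@dataclass
class ParametricWall:
    length: float          # metres
    height: float          # metres
    thickness: float = 0.2

    def __post_init__(self):
        # The constraint travels with the component, so any change is re-checked.
        if min(self.length, self.height, self.thickness) <= 0:
            raise ValueError("all dimensions must be positive")

    def volume(self) -> float:
        return self.length * self.height * self.thickness

wall = ParametricWall(length=6.0, height=3.0)
wall.length = 7.5            # a design change: only the parameter moves
print(wall.volume())         # derived geometry updates from the parameters
```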

Relevance:

30.00%

Publisher:

Abstract:

Introduction Informal caring networks contribute significantly to end-of-life (EOL) care in the community. However, to ensure that these networks are sustainable and that unpaid carers are not exploited, primary carers need permission and practical assistance to gather networks together and negotiate the help they need. Our aim in this study was to develop an understanding of how formal and informal carers work together when care is being provided in a dying person's home. We were particularly interested in formal providers’ perceptions and knowledge of informal networks of care and in identifying barriers to the networks working together. Methods Qualitative methods, informed by an interpretive approach, were used. In February-July 2012, 10 focus groups comprising 88 participants were conducted in urban, regional, and rural Australia. Findings Our findings show that formal providers are aware, and supportive, of the vital role informal networks play in the care of the dying at home. A number of barriers to formal and informal networks working together more effectively were identified. In particular, we found that the Australian policy of health-promoting palliative care is not substantially translating into practice. Conclusion Combinations of formal and informal caring networks are essential to support people and their primary carers. Formal service providers do little to establish, support, or maintain the informal networks, although there is much goodwill and scope for them to do so. Further re-orientation towards a health-promoting palliative care and community capacity-building approach is suggested.

Relevance:

30.00%

Publisher:

Abstract:

Study/Objective This program of research examines the effectiveness of legal mechanisms as motivators to maximise engagement and compliance with evacuation messages. The study is based on the understanding that the presence of legislative requirements, as well as sanctions and incentives encapsulated in law, can have a positive impact on achieving compliance. Our objective is to examine whether the current Australian legal frameworks, which incorporate evacuation during disasters, are an effective structure that is properly understood by those who enforce it and those who are required to comply. Background In Australia, most jurisdictions have enacted legislation that encapsulates the power to evacuate and the ability to enforce compliance, either by the use of force or the imposition of a penalty. However, citizens still choose not to evacuate. Methods This program of research incorporates theoretical and doctrinal methodologies for reviewing literature and legislation in the Australian context. The aim of the research is to determine whether further clarity is required to create an understanding of the powers to evacuate, as well as greater public awareness of those powers. Results & Conclusion Legislators suggest that powers of evacuation can be ineffective if they are impractical to enforce. In Australia, there may also be confusion about which legislative instrument the power to evacuate derives from, and therefore whether there is a corresponding ability to enforce compliance through the use of force or the imposition of a penalty. Equally, communities may lack awareness and understanding of the powers of agencies to enforce compliance. We seek to investigate whether this is the case, and whether, even if greater awareness existed, it would act as an incentive to comply.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the application of vector spaces over Galois fields to obtaining a formal description of a picture in the form of a very compact, non-redundant, unique syntactic code. Two different methods of encoding are described. Both methods consist in identifying the given picture with a matrix (called the picture matrix) over a finite field. In the first method, the eigenvalues and eigenvectors of this matrix are obtained. The eigenvector expansion theorem is then used to reconstruct the original matrix. If several of the eigenvalues happen to be zero, this scheme results in considerable compression. In the second method, the picture matrix is reduced to a primitive diagonal form (Hermite canonical form) by elementary row and column transformations. These sequences of elementary transformations constitute a unique and unambiguous syntactic code, called the Hermite code, for reconstructing the picture from the primitive diagonal matrix. Good compression of the picture results if the rank of the matrix is considerably lower than its order. An important aspect of this code is that it preserves the neighbourhood relations in the picture, and the primitive remains invariant under translation, rotation, reflection, enlargement and replication. It is also possible to derive the codes for these transformed pictures from the Hermite code of the original picture by simple algebraic manipulation. This code will find extensive applications in picture compression, storage, retrieval and transmission, and in designing pattern recognition and artificial intelligence systems.
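
A hedged reconstruction of the core step of the second method for a binary picture over GF(2): reduce the picture matrix by elementary row operations (column operations would follow analogously) and record the operations as a code. This is an illustrative sketch of the idea, not the paper's algorithm.

```python
# Row-reduce a 0/1 picture matrix over GF(2), recording the elementary operations.
import numpy as np

def hermite_like_code(P):
    """Return (reduced_matrix, ops) for a 0/1 picture matrix P over GF(2)."""
    A = P.copy() % 2
    ops = []
    n, m = A.shape
    r = 0
    for c in range(m):
        pivot_rows = [i for i in range(r, n) if A[i, c]]
        if not pivot_rows:
            continue
        if pivot_rows[0] != r:
            A[[r, pivot_rows[0]]] = A[[pivot_rows[0], r]]
            ops.append(("swap_rows", r, pivot_rows[0]))
        for i in range(n):
            if i != r and A[i, c]:
                A[i] ^= A[r]
                ops.append(("add_row", r, i))   # row r added to row i (mod 2)
        r += 1
    # Column operations would clear remaining entries to reach a diagonal form.
    return A, ops

picture = np.array([[1, 1, 0],
                    [1, 0, 1],
                    [0, 1, 1]], dtype=np.uint8)
reduced, code = hermite_like_code(picture)
print(reduced)
print(code)   # rank well below the order means a short code and good compression
```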

Relevance:

30.00%

Publisher:

Abstract:

The binding behavior of two cationic dyes, brilliant cresyl blue (BCB) and methylene green (MG), to calf thymus DNA was studied by spectrophotometric and voltammetric methods. A red shift of the absorption spectra and hypochromism accompany the binding of BCB and MG to calf thymus DNA. In 5 × 10^-2 mol dm^-3 NaCl, 5 × 10^-3 mol dm^-3 Tris-HCl pH 6.87 buffer solution, the apparent binding constants are K_BCB+ = 3.0 × 10^4 M^-1 (n = 4.13) and K_MG+ = 8.8 × 10^4 M^-1 (n = 4.44). Electrochemical studies show that the formal potentials shift negatively upon addition of DNA, indicating that the oxidized forms of the dyes have a stronger affinity for DNA than the reduced ones. K_BCB+/K_BCBH and K_MG+/K_MGH are evaluated to be 10.39 and 7.04, respectively. Our investigation suggests that the two cationic dyes interact with DNA predominantly via electrostatic interaction.
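
For readers who want to connect the reported binding-constant ratios to the sign of the potential shift, the sketch below applies the standard Nernst-type relation E°'_b − E°'_f = (RT/nF) ln(K_red/K_ox); the electron number n = 2 is an illustrative assumption, not a value given in the abstract.

```python
# Nernst-type link between the binding-constant ratio and the formal-potential shift.
import math

R, F, T = 8.314, 96485.0, 298.15   # J mol^-1 K^-1, C mol^-1, K

def potential_shift(k_ox_over_k_red, n=2):
    """Shift of the formal potential (V) on binding, for a given K_ox/K_red."""
    return (R * T / (n * F)) * math.log(1.0 / k_ox_over_k_red)

# Ratios reported above: K_BCB+/K_BCBH = 10.39 and K_MG+/K_MGH = 7.04.
for dye, ratio in [("BCB", 10.39), ("MG", 7.04)]:
    # K_ox > K_red gives a negative shift, consistent with the observation above.
    print(f"{dye}: predicted shift = {potential_shift(ratio) * 1000:.1f} mV")
```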

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To compare the performance of formal prognostic instruments vs subjective clinical judgment with regard to predicting functional outcome in patients with spontaneous intracerebral hemorrhage (ICH). METHODS: This prospective observational study enrolled 121 ICH patients hospitalized at 5 US tertiary care centers. Within 24 hours of each patient's admission to the hospital, one physician and one nurse on each patient's clinical team were each asked to predict the patient's modified Rankin Scale (mRS) score at 3 months and to indicate whether he or she would recommend comfort measures. The admission ICH score and FUNC score, 2 prognostic scales selected for their common use in neurologic practice, were calculated for each patient. Spearman rank correlation coefficients (r) of the physician and nursing predictions with respect to patients' actual 3-month mRS were compared against the corresponding correlation coefficients for the ICH score and FUNC score. RESULTS: The absolute value of the correlation coefficient for physician predictions with respect to actual outcome (0.75) was higher than that of either the ICH score (0.62, p = 0.057) or the FUNC score (0.56, p = 0.01). The nursing predictions of outcome (r = 0.72) also trended towards an accuracy advantage over the ICH score (p = 0.09) and FUNC score (p = 0.03). In an analysis that excluded patients for whom comfort care was recommended, the 65 available attending physician predictions retained greater accuracy (r = 0.73) than either the ICH score (r = 0.50, p = 0.02) or the FUNC score (r = 0.42, p = 0.004). CONCLUSIONS: Early subjective clinical judgment of physicians correlates more closely with 3-month outcome after ICH than do prognostic scales.
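
A toy sketch of the correlation step: Spearman rank correlation between predicted and actual 3-month mRS, computed with scipy. The arrays are invented illustrative data, and the study's formal comparison of dependent correlations is not reproduced here.

```python
# Spearman rank correlation of two hypothetical predictors against actual 3-month mRS.
import numpy as np
from scipy.stats import spearmanr

actual_mrs     = np.array([0, 2, 3, 4, 6, 5, 1, 4, 6, 2])   # invented outcomes
physician_pred = np.array([1, 2, 3, 5, 6, 4, 1, 4, 5, 3])   # invented predictions
ich_score_pred = np.array([2, 1, 4, 4, 6, 5, 2, 3, 6, 4])

for name, pred in [("physician", physician_pred), ("ICH score", ich_score_pred)]:
    r, p = spearmanr(pred, actual_mrs)
    print(f"{name}: Spearman r = {r:.2f} (p = {p:.3f})")
```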

Relevance:

30.00%

Publisher:

Abstract:

This study sought to explore ways to work with a group of young people through an arts-based approach to the teaching of literacy. Through the research, the author integrated her own reflexivity about applying arts methods over the past decade. The author’s past experiences were strongly informed by theories such as caring theory and maternal pedagogy, which also informed the research design. The study incorporated qualitative data collection instruments comprising interviews, journals, sketches, artifacts, and teacher field notes. Data were collected from 3 student participants for the duration of the research. Study results provide educators with data on the impact of creating informal and alternative ways to teach literacy and to maintain the engagement of resistant learners.

Relevance:

30.00%

Publisher:

Abstract:

Ontologies have been established for knowledge sharing and are widely used as a means of conceptually structuring domains of interest. With the growing usage of ontologies, the problem of overlapping knowledge in a common domain becomes critical. In this short paper, we address two methods for merging ontologies based on Formal Concept Analysis: FCA-Merge and ONTEX.

FCA-Merge is a method for merging ontologies following a bottom-up approach which offers a structural description of the merging process. The method is guided by application-specific instances of the given source ontologies. We apply techniques from natural language processing and formal concept analysis to derive a lattice of concepts as a structural result of FCA-Merge. The generated result is then explored and transformed into the merged ontology with human interaction.

ONTEX is a method for systematically structuring the top level of ontologies. It is based on an interactive, top-down knowledge acquisition process, which ensures that the knowledge engineer considers all possible cases while avoiding redundant acquisition. The method is especially suited to creating or merging the top part(s) of ontologies, where high accuracy is required, and to supporting the merging of two (or more) ontologies at that level.
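
A minimal sketch of the Formal Concept Analysis step that underlies FCA-Merge: enumerating the formal concepts (closed extent/intent pairs) of a small object-attribute context. The context below is an invented toy example, not data from the paper.

```python
# Brute-force enumeration of the formal concepts of a tiny object/attribute context.
from itertools import combinations

context = {                      # object -> set of attributes (ontology concepts) it instantiates
    "doc1": {"Hotel", "Accommodation"},
    "doc2": {"Hotel", "Event"},
    "doc3": {"Concert", "Event"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attrs(objs):
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def objects_with(attrs):
    return {o for o in objects if attrs <= context[o]}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        intent = common_attrs(set(objs))
        extent = objects_with(intent)        # close the object set
        concepts.add((frozenset(extent), frozenset(intent)))

# The concepts, ordered by extent size, form the lattice FCA-Merge works with.
for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "->", sorted(intent))
```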

Relevance:

30.00%

Publisher:

Abstract:

I have designed and implemented a system for the multilevel verification of synchronous MOS VLSI circuits. The system, called Silica Pithecus, accepts the schematic of an MOS circuit and a specification of the circuit's intended digital behavior, and determines whether the circuit meets its specification. If the circuit fails to meet its specification, Silica Pithecus returns to the designer the reason for the failure. Unlike earlier verifiers, which modelled primitives (e.g., transistors) as unidirectional digital devices, Silica Pithecus models primitives more realistically: transistors are modelled as bidirectional devices of varying resistances, and nodes are modelled as capacitors. Silica Pithecus operates hierarchically, interactively, and incrementally. Major contributions of this research include a formal understanding of the relationship between different behavioral descriptions (e.g., signal, boolean, and arithmetic descriptions) of the same device, and a formalization of the relationship between the structure, behavior, and context of a device. Given these formal structures, my methods find sufficient conditions on the inputs of circuits which guarantee the correct operation of the circuit in the desired descriptive domain. These methods are algorithmic and complete. They also handle complex phenomena such as races and charge sharing. Informal notions such as races and hazards are shown to be derivable from the correctness conditions used by my methods.
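
One of the phenomena such a verifier must reason about is charge sharing; the sketch below shows the simple capacitive-divider calculation involved when previously isolated nodes are connected through an 'on' transistor. The capacitance and voltage values are illustrative, not taken from Silica Pithecus.

```python
# Charge sharing between capacitive nodes connected by a conducting transistor.
def charge_share(nodes):
    """nodes: list of (capacitance_fF, voltage_V) tuples joined by 'on' switches.

    Once connected, charge redistributes and every node settles to the same
    voltage: total charge divided by total capacitance.
    """
    total_charge = sum(c * v for c, v in nodes)
    total_cap = sum(c for c, _ in nodes)
    return total_charge / total_cap

# A small dynamic node at 5 V sharing charge with a large discharged bus line:
v_final = charge_share([(10.0, 5.0), (90.0, 0.0)])
print(f"settled voltage = {v_final:.2f} V")   # 0.50 V: the stored '1' is lost
```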

Relevance:

30.00%

Publisher:

Abstract:

The decomposition methods that can be applied when the data are fully observed are not valid when the variable of interest is censored. This may explain the scarcity of such exercises for duration variables, which are usually observed under censoring. This paper proposes an Oaxaca-Blinder-type method for decomposing differences in means in the context of censored data. The validity of the method rests on the identification and estimation of the joint distribution of the duration variable and a set of covariates. In addition, a more general method is proposed that allows other functionals of interest, such as the median or the Gini coefficient, to be decomposed; it is based on specifying the conditional distribution function of the duration variable given a set of covariates. Monte Carlo experiments are carried out to assess the performance of these methods. Finally, the proposed methods are applied to analyse gender gaps in several features of unemployment duration in Spain, such as mean duration, the probability of being long-term unemployed, and the Gini coefficient. The results indicate that factors other than observable characteristics, such as human capital or household structure, play a key role in explaining these gaps.
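
A hedged sketch of a standard (uncensored) Oaxaca-Blinder decomposition of a mean gap into explained and unexplained components, run on simulated data; the censoring-robust variant proposed in the paper is not reproduced here.

```python
# Oaxaca-Blinder decomposition of a mean gap on simulated, fully observed data.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, beta, mean_x):
    X = np.column_stack([np.ones(n), rng.normal(mean_x, 1.0, n)])
    y = X @ beta + rng.normal(0.0, 1.0, n)
    return X, y

X_m, y_m = simulate(2000, beta=np.array([2.0, 0.5]), mean_x=1.0)   # "group A"
X_f, y_f = simulate(2000, beta=np.array([2.5, 0.5]), mean_x=0.6)   # "group B"

b_m = np.linalg.lstsq(X_m, y_m, rcond=None)[0]   # OLS coefficients, group A
b_f = np.linalg.lstsq(X_f, y_f, rcond=None)[0]   # OLS coefficients, group B

gap = y_f.mean() - y_m.mean()
explained = (X_f.mean(axis=0) - X_m.mean(axis=0)) @ b_m   # differences in characteristics
unexplained = X_f.mean(axis=0) @ (b_f - b_m)              # differences in coefficients
print(f"gap = {gap:.3f}, explained = {explained:.3f}, unexplained = {unexplained:.3f}")
```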