845 results for Symbolic and Algebraic Manipulation
Abstract:
The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be most consistently effective, although Consistency-derived subsets tended to slightly increase accuracy at the cost of markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution. This influence can be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with time-segmented summary data (dataset F) giving MR 9.8 and raw time-series summary data (dataset A) giving 9.92. However, for all datasets based on time-series data alone, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets are of one leaf only. MR values are consistent with class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on derived variables based on time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, the nearest-neighbour method NNge and the support vector machine based method SMO have the highest MR (10.1 and 10.28 respectively), while logistic regression (LR) and the decision tree (DT) method J48 have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The predictive accuracy increase achieved by addition of risk factor variables to time-series variable based models is significant. The addition of time-series derived variables to models based on risk factor variables alone is associated with a trend to improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables when compared with use of risk factors alone parallels recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used together as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological variable values being outside the accepted normal range, is associated with some improvement in model performance.
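The evaluation protocol described above pairs majority-class under-sampling with the Kappa statistic and AUC so that model quality is not flattered by class imbalance. As a minimal sketch of that protocol only (not the thesis's actual Weka pipeline; Cfs, J48, SMO and NNge are Weka components, and the data, class ratio and names below are hypothetical, with scikit-learn's DecisionTreeClassifier standing in for J48):

```python
# Minimal imbalance-aware evaluation sketch. X holds derived time-series
# and risk-factor variables per anaesthesia case; y is the binary CVD
# label. All data here are synthetic and the names hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score, roc_auc_score

rng = np.random.default_rng(0)

def undersample_majority(X, y):
    """Randomly discard majority-class rows until the classes balance."""
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
    maj, mino = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    idx = np.concatenate([mino, rng.choice(maj, size=len(mino), replace=False)])
    return X[idx], y[idx]

X = rng.normal(size=(1000, 12))                 # toy feature matrix
y = (rng.random(1000) < 0.15).astype(int)       # ~15% minority class

Xb, yb = undersample_majority(X, y)
X_tr, X_te, y_tr, y_te = train_test_split(Xb, yb, random_state=0)

clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)  # J48 analogue
pred = clf.predict(X_te)
print("misclassification rate:", 1 - (pred == y_te).mean())
print("Kappa:", cohen_kappa_score(y_te, pred))
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```

On random data such as this, Kappa sits near zero and AUC near 0.5, making chance-level performance visible in a way that misclassification rate on an imbalanced sample would not.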
Abstract:
Home Automation (HA) has emerged as a prominent field for researchers and investors confronting the challenge of penetrating the average home user market with products and services emerging from technology based vision. In spite of many technology contributions, there is a latent demand for affordable and pragmatic assistive technologies for pro-active handling of complex lifestyle related problems faced by home users. This study has pioneered the development of an Initial Technology Roadmap for HA (ITRHA) that formulates a need based vision of 10-15 years, identifying market, product and technology investment opportunities, focusing on those aspects of HA contributing to efficient management of home and personal life. The concept of the Family Life Cycle is developed to understand the temporal needs of the family. In order to formally describe a coherent set of family processes, their relationships, and interaction with external elements, a reference model named Family System is established that identifies External Entities, 7 major Family Processes, and 7 subsystems: Finance, Meals, Health, Education, Career, Housing, and Socialisation. Analysis of these subsystems reveals Soft, Hard and Hybrid processes. Rectifying the lack of formal methods for eliciting future user requirements and reassessing evolving market needs, this study has developed a novel method called Requirement Elicitation of Future Users by Systems Scenario (REFUSS), integrating process modelling and scenario techniques within the framework of roadmapping. REFUSS is used to systematically derive process automation needs, relating the process knowledge to future user characteristics identified from scenarios created to visualise different futures with richly detailed information on lifestyle trends, thus enabling learning about future requirements. Revealing an addressable market size estimate of billions of dollars per annum, this research has developed innovative ideas on software based products, including Document Management Systems facilitating automated collection and easy retrieval of all documents, an Information Management System automating information services, and a Ubiquitous Intelligent System empowering highly mobile home users with ambient intelligence. Other product ideas include the robotic devices of a versatile Kitchen Hand and Cleaner Arm that can be time saving. Materialisation of these products requires technology investment initiating further research in areas of data extraction and information integration as well as manipulation and perception, sensor actuator systems, tactile sensing, odour detection, and robotic controllers. This study recommends new policies on electronic data delivery from service providers as well as new standards on XML based document structure and format.
Abstract:
The law and popular opinion expect that boards of directors will actively monitor their organisations. Further, public opinion is that boards should have a positive impact on organisational performance. However, the processes of board monitoring and judgment are poorly understood, and board influence on organisational performance needs to be better understood. This thesis responds to the repeated calls to open the ‘black box’ linking board practices and organisational performance by investigating the processual behaviours of boards. The work of four boards of micro and small-sized nonprofit organisations was studied for periods of at least one year, using a processual research approach, drawing on observations of board meetings, interviews with directors, and the documents of the boards. The research shows that director turnover, the difficulty of recruiting and engaging directors, and the administration of reporting had strong impacts upon board monitoring, judging and/or influence. In addition, board monitoring of organisational performance was adversely affected by directors’ limited awareness of their legal responsibilities and directors’ limited financial literacy. Directors on average found all sources of information about their organisation’s work useful. Board judgments about the financial aspects of organisational performance were regulated by the routines of financial reporting. However, there were no comparable routines facilitating judgments about non-financial performance, and such judgments tended to be limited to specific aspects of performance and were ad hoc, largely in response to new information or the repackaging of existing information in a new form. The thesis argues that Weick’s theory of sensemaking offers insight into the way boards went about the task of understanding organisational performance. Board influence on organisational performance was demonstrated in the areas of: compliance; instrumental influence through service and through discussion and decision-making; and symbolic, legitimating and protective means. The degree of instrumental influence achieved by boards depended on director competency, access to networks of influence, understandings of board roles, and the agency demonstrated by directors. The thesis concludes that there is a crowding-out effect whereby CEO competence and capability limit board influence. The thesis also suggests that there is a second ‘agency problem’, a problem of director volition. The research potentially has profound implications for the work of nonprofit boards. Rather than purporting to establish a general theory of board governance, the thesis embraces calls to build situation-specific mini-theories about board behaviour.
Abstract:
Stream ciphers are encryption algorithms used for ensuring the privacy of digital telecommunications. They have been widely used for encrypting military communications, satellite communications and pay TV, and for voice encryption of both fixed-line and wireless networks. The current multi-year European project eSTREAM, which aims to select stream ciphers suitable for widespread adoption, reflects the importance of this area of research. Stream ciphers consist of a keystream generator and an output function. Keystream generators produce a sequence that appears to be random, which is combined with the plaintext message using the output function. Most commonly, the output function is binary addition modulo two. Cryptanalysis of these ciphers focuses largely on analysis of the keystream generators and of relationships between the generator and the keystream it produces. Linear feedback shift registers (LFSRs) are widely used components in building keystream generators, as the sequences they produce are well understood. Many types of attack have been proposed for breaking various LFSR based stream ciphers. A recent attack type is known as an algebraic attack. Algebraic attacks transform the problem of recovering the key into the problem of solving a multivariate system of equations which, when solved, recovers the internal state bits or the key bits. This type of attack has been shown to be effective on a number of regularly clocked LFSR based stream ciphers. In this thesis, algebraic attacks are extended to a number of well known stream ciphers where at least one LFSR in the system is irregularly clocked. Applying algebraic attacks to these ciphers has only been discussed previously in the open literature for LILI-128. In this thesis, algebraic attacks are first applied to keystream generators using stop-and-go clocking. Four ciphers belonging to this group are investigated: the Beth-Piper stop-and-go generator, the alternating step generator, the Gollmann cascade generator and the eSTREAM candidate, the Pomaranch cipher. It is shown that algebraic attacks are very effective on the first three of these ciphers. Although no effective algebraic attack was found for Pomaranch, the algebraic analysis led to some interesting findings, including weaknesses that may be exploited in future attacks. Algebraic attacks are then applied to keystream generators using (p, q) clocking. Two well known examples of such ciphers, the step1/step2 generator and the self-decimated generator, are investigated. Algebraic attacks are shown to be very powerful in recovering the internal state of these generators. A more complex clocking mechanism than either stop-and-go or (p, q) clocking is mutual clock control. In mutual clock control generators, the LFSRs control the clocking of each other. Four well known stream ciphers belonging to this group are investigated with respect to algebraic attacks: the bilateral stop-and-go generator, the A5/1 stream cipher, the Alpha 1 stream cipher, and the more recent eSTREAM proposal, the MICKEY stream ciphers. Some theoretical results with regard to the complexity of algebraic attacks on these ciphers are presented. The algebraic analysis of these ciphers showed that, generally, it is hard to generate the system of equations required for an algebraic attack on these ciphers.
As the algebraic attack could not be applied directly to these ciphers, a different approach was used, namely guessing some bits of the internal state in order to reduce the degree of the equations. Finally, an algebraic attack on Alpha 1 that requires only 128 bits of keystream to recover the 128 internal state bits is presented. An essential process associated with stream cipher proposals is key initialization. Many recently proposed stream ciphers use an algorithm to initialize the large internal state with a smaller key and possibly publicly known initialization vectors. The effect of key initialization on the performance of algebraic attacks is also investigated in this thesis. The relationship between the two has not been investigated before in the open literature. The investigation is conducted on Trivium and Grain-128, two eSTREAM ciphers. It is shown that the key initialization process has an effect on the success of algebraic attacks, unlike other conventional attacks. In particular, the key initialization process allows an attacker to first generate a small number of equations of low degree and then perform an algebraic attack using multiple keystreams. The effect of the number of iterations performed during key initialization is investigated. It is shown that both the number of iterations and the maximum number of initialization vectors to be used with one key should be carefully chosen. Some experimental results on Trivium and Grain-128 are then presented. Finally, the security with respect to algebraic attacks of the well known LILI family of stream ciphers, including the unbroken LILI-II, is investigated. These are irregularly clock-controlled nonlinear filtered generators. While the structure is defined for the LILI family, a particular parameter choice defines a specific instance. Two well known such instances are LILI-128 and LILI-II. The security of these and other instances is investigated to identify which instances are vulnerable to algebraic attacks. The feasibility of recovering the key bits using algebraic attacks is then investigated for both LILI-128 and LILI-II. Algebraic attacks which recover the internal state with less effort than exhaustive key search are possible for LILI-128 but not for LILI-II. Given the internal state at some point in time, the feasibility of recovering the key bits is also investigated, showing that the parameters used in the key initialization process, if poorly chosen, can lead to key recovery using algebraic attacks.
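As a toy illustration of the reduction these attacks rest on (a sketch only, not any cipher named above; the register length, taps and filter below are made up): for a regularly clocked LFSR with a linear output filter, every keystream bit is a GF(2)-linear combination of the unknown initial-state bits, so state recovery is plain linear algebra.

```python
# Toy state recovery for a regularly clocked, linearly filtered LFSR
# (hypothetical 8-bit register; not one of the ciphers above).
import numpy as np

TAPS, N = [0, 2, 3, 5], 8          # feedback taps and state size

def keystream(state, n):
    """Clock the LFSR n times; output bit = state[0] XOR state[3]."""
    state, out = list(state), []
    for _ in range(n):
        out.append(state[0] ^ state[3])
        fb = 0
        for t in TAPS:
            fb ^= state[t]
        state = state[1:] + [fb]   # shift left, feed back
    return out

# Clock the register symbolically, tracking each state bit as a GF(2)
# coefficient vector over the unknown initial state s0, so that
# keystream bit i = A[i] . s0 (mod 2).
sym = [np.eye(N, dtype=int)[j] for j in range(N)]
A = []
for _ in range(N):
    A.append((sym[0] + sym[3]) % 2)
    fb = sym[TAPS[0]].copy()
    for t in TAPS[1:]:
        fb = (fb + sym[t]) % 2
    sym = sym[1:] + [fb]
A = np.array(A)

def solve_gf2(A, b):
    """Gaussian elimination over GF(2); assumes a unique solution."""
    M = np.concatenate([A % 2, np.array(b).reshape(-1, 1)], axis=1)
    for c in range(len(b)):
        p = next(r for r in range(c, len(b)) if M[r, c])
        M[[c, p]] = M[[p, c]]              # pivot row swap
        for r in range(len(b)):
            if r != c and M[r, c]:
                M[r] ^= M[c]               # eliminate column c
    return M[:, -1]

secret = [1, 0, 1, 1, 0, 0, 1, 0]
z = keystream(secret, N)                   # the attacker sees only this
print(solve_gf2(A, z))                     # recovers the initial state
```

With an irregularly clocked register, which state bits feed the output depends on data-dependent clock counts, so the analogous equations contain products of unknown bits; guessing a few of them, as in the thesis, lowers the degree back toward a solvable system.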
Abstract:
Computer aided technologies, medical imaging, and rapid prototyping have created new possibilities in biomedical engineering. The systematic variation of scaffold architecture, as well as the mineralization inside a scaffold/bone construct, can be studied using computer imaging technology, CAD/CAM and micro-computed tomography (micro-CT). In this paper, the potential of combining these technologies has been exploited in the study of scaffolds and osteochondral repair. Porosity, surface area per unit volume and the degree of interconnectivity were evaluated through imaging and computer aided manipulation of the scaffold scan data. For the osteochondral model, the spatial distribution and the degree of bone regeneration were evaluated. This study also assessed the versatility of two software packages: Mimics (Materialise), and CTan with 3D realistic visualization (SkyScan).
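As a minimal sketch of the kind of computer aided manipulation of scan data this involves, assuming the micro-CT stack has already been segmented into a binary volume (the array, threshold and voxel semantics below are hypothetical):

```python
# Scaffold metrics from a binarised micro-CT volume: porosity, a crude
# surface-area-per-unit-volume estimate, and pore connectivity.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(1)
vol = rng.random((64, 64, 64)) < 0.3        # toy stack: True = material

porosity = 1.0 - vol.mean()                 # pore fraction of total volume

# Count exposed voxel faces along each axis (interior faces only) and
# normalise by total volume; units are per voxel edge.
faces = sum(np.abs(np.diff(vol.astype(np.int8), axis=a)).sum() for a in range(3))
sa_per_vol = faces / vol.size

# Interconnectivity proxy: number of disconnected pore regions
# (a single region means a fully interconnected pore space).
_, n_pore_regions = label(~vol)

print(f"porosity {porosity:.2f}, SA/V {sa_per_vol:.3f}, "
      f"pore regions {n_pore_regions}")
```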
Abstract:
This work examines the algebraic cryptanalysis of small-scale variants of LEX-BES. LEX-BES is a stream cipher based on the Advanced Encryption Standard (AES) block cipher. LEX is a generic method for constructing a stream cipher from a block cipher, initially introduced by Biryukov at eSTREAM, the ECRYPT Stream Cipher project, in 2005. The Big Encryption System (BES) is a block cipher introduced at CRYPTO 2002 which facilitates the algebraic analysis of the AES block cipher. In this article, experiments were conducted to find solutions of equation systems describing small-scale LEX-BES using Gröbner basis computations. This follows a similar approach to the work by Cid, Murphy and Robshaw at FSE 2005 that investigated algebraic cryptanalysis of small-scale variants of the BES. The difference between LEX-BES and BES is that, owing to the way the keystream is extracted, the number of unknowns in the LEX-BES equations is smaller than the number in BES. As far as the authors know, this is the first attempt at creating solvable equation systems for stream ciphers based on the LEX method using Gröbner basis computations.
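As a toy illustration of the solving step alone (the actual small-scale LEX-BES systems are far larger and highly structured), the following SymPy sketch computes a lexicographic Gröbner basis for a made-up polynomial system over GF(2); the field equations v**2 - v are appended so that solutions are confined to {0, 1}:

```python
# Toy Gröbner-basis computation over GF(2) (illustrative only; the
# system below is invented for the example).
from sympy import symbols, groebner

x, y, z = symbols('x y z')

# A small Boolean system plus the GF(2) field equations v**2 - v.
system = [
    x*y + z + 1,
    y*z + x,
    x + y + z + 1,
    x**2 - x, y**2 - y, z**2 - z,
]
G = groebner(system, x, y, z, modulus=2, order='lex')
print(G)   # expected lex basis: [x, y, z + 1]
```

For a system with a unique Boolean solution, the lex basis degenerates to linear polynomials that read off the solution directly, here (x, y, z) = (0, 0, 1).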
Abstract:
Increasingly, celebrities appear not only as endorsers for products but are apparently engaged in entrepreneurial roles as initiators, owners and perhaps even managers in the ventures that market the products they promote. Although the phenomenon is extensively referred to in popular media, scholars have been slow to recognise its importance. This thesis argues theoretically and shows empirically that celebrity entrepreneurs are more effective communicators than typical celebrity endorsers because of their increased engagement with ventures. I theorise that greater engagement increases the celebrity’s emotional involvement as perceived by consumers. This is an endorser quality thus far neglected in the marketing communications literature. In turn, emotional involvement, much like the empirically established dimensions of trustworthiness, expertise and attractiveness, should affect traditional outcome variables such as attitude towards the advertisement and brand. On the downside, increases in celebrity engagement may lead to relatively stronger and worsening changes in attitudes towards the brand if and when negative information about the celebrity is revealed. A series of eight experiments was conducted on 781 Swedish and Baltic students and 151 Swedish retirees. Though there were nuanced differences and additional complexities in each experiment, participants’ reactions to advertisements containing a celebrity portrayed as a typical endorser or as an entrepreneur were recorded. The overall results of these experiments suggest that emotional involvement can be successfully operationalised as distinct from variables previously known to influence communication effectiveness. In addition, emotional involvement has positive effects on attitudes toward the advertisement and brand that are as strong as those of the predictors traditionally applied in the marketing communications literature. Moreover, the celebrity entrepreneur condition in the experimental manipulation consistently led to an increase in emotional involvement and, to a lesser extent, trustworthiness, but not expertise and attractiveness. Finally, negative celebrity information led to a change in participants’ attitudes towards the brand which was more strongly negative for celebrity entrepreneurs than for celebrity endorsers. In addition, the effect of negative celebrity information on a company’s brand is worse when the company supports the celebrity rather than firing them. However, this effect did not appear to interact with the celebrity’s purported engagement.
Abstract:
Prolific British author/illustrator Anthony Browne both participates in the classic fairy-tale tradition and appropriates its cultural capital, ultimately undertaking a process of self-canonisation alongside the dissemination of fairy tales. In reading Browne’s Hansel and Gretel (1981), The Tunnel (1989) and Into the Forest (2004), a trajectory emerges that moves from broadly intertextual to more exclusively self-referential modes of representation which reward readers of “Anthony Browne”, rather than readers of “fairy tales”. All three books depict ‘babes in the woods’ stories wherein child characters must negotiate some form of threat outside the home in order to return home safely. Thus, they represent childhood agency. However, these visions of agency are ultimately subordinated to logics of capital, which means that child readers of Browne’s fairy-tale books are overtly invited to identify with children who act, but are interpellated as privileged if they ‘know’. Bourdieu’s model of ‘cultural capital’ offers a lens for considering Browne’s production of ‘value’ for his own works within a broader cultural landscape which privileges literary fairy tales as a register of juvenile cultural competency. If cultural capital can be formulated most simply as the symbolic exchange value of approved modes of knowing and being, it is clearly helpful when trying to unpack logics of meaning within heavily intertextual or citational texts. It is also helpful thinking about what kinds of stories we as a culture choose to disseminate, choose to privilege, or choose to suppress. Zipes notes of fairy tales that, “the genre itself becomes a kind of institute that is involved in the socialization and acculturation of readers” (22). He elaborates that, “We initiate readers and expect them to learn the fairy-tale code as part of our responsibility in the civilizing process” (Zipes 29), so it is little wonder that Tatar describes fairy tales as “a vital part of our cultural capital” (xix). Although Browne is clearly interested in literary fairy tales, the most obvious strategies of self-canonisation take place in Browne’s work not in words but in pictures: hidden in plain sight, as illustration becomes self-reflexive citation.
Abstract:
Regenerative medicine techniques are currently being investigated to replace damaged cartilage. Critical to the success of these techniques is the ability to expand the initial population of cells while minimising de-differentiation to allow for hyaline cartilage to form. Three-dimensional culture systems have been shown to enhance the differentiation of chondrocytes in comparison to two-dimensional culture systems. Additionally, bioreactor expansion on microcarriers can provide mechanical stimulation and reduce the amount of cellular manipulation during expansion. The aim of this study was to characterise the expansion of human chondrocytes on microcarriers and to determine their potential to form cartilaginous tissue in vitro. High-grade human articular cartilage was obtained from leg amputations with ethics approval. Chondrocytes were isolated by collagenase digestion and expanded in either monolayers (10⁴ cells/cm²) or on CultiSpher-G microcarriers (10⁴ cells/mg) for three weeks. Following expansion, monolayer cells were passaged and cells on microcarriers were either left intact or the cells were released with trypsin/EDTA. Pellets from these three groups were formed and cultured for three weeks to establish the chondrogenic differentiation potential of monolayer-expanded and microcarrier-expanded chondrocytes. Cell viability, proliferation, glycosaminoglycan (GAG) accumulation, and collagen synthesis were assessed. Histology and immunohistochemistry were also performed. Human chondrocytes remained viable and expanded on microcarriers 10.2±2.6 fold in three weeks. GAG content significantly increased with time, with the majority of GAG found in the medium. Collagen production per nanogram DNA increased marginally during expansion. Histology revealed that chondrocytes were randomly distributed on microcarrier surfaces yet most pores remained cell free. Critically, human chondrocytes expanded on microcarriers maintained their ability to redifferentiate in pellet culture, as demonstrated by Safranin-O and collagen II staining. These data confirm the feasibility of microcarriers for passage-free cultivation of human articular chondrocytes. However, cell expansion needs to be improved, perhaps through growth factor supplementation, for clinical utility. Recent data indicate that cell-laden microcarriers can be used to seed fresh microcarriers, thereby increasing the expansion factor while minimising enzymatic passage.
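The reported expansion translates directly into an implied population-doubling time; as a back-of-envelope check (assuming uninterrupted exponential growth across the 21-day culture, which is a simplification):

```python
# Implied population-doubling time for the reported 10.2-fold expansion,
# assuming exponential growth throughout the three-week culture.
import math

fold, days = 10.2, 21
doublings = math.log2(fold)                           # ~3.35 doublings
print(f"{doublings:.2f} doublings; "
      f"doubling time ~{days / doublings:.1f} days")  # ~6.3 days
```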
Abstract:
Citizenship is a term of association among strangers. Access to it involves contested identities and symbolic meanings, differing power relations and strategies of inclusion, exclusion and action, and unequal room for maneuver or productivity in the uses of citizenship for any given group or individual. In the context of "rethinking communication," strenuous action is needed to associate such different life chances in a common enterprise at a national level or, more modestly, simply to claim equivalence for all such groups under the rule of one law.
Abstract:
Over the last three years, in our Early Algebra Thinking Project, we have been studying Years 3 to 5 students’ ability to generalise in a variety of situations, namely, compensation principles in computation, the balance principle in equivalence and equations, change and inverse change rules with function machines, and pattern rules with growing patterns. In these studies, we have attempted to involve a variety of models and representations and to build students’ abilities to switch between them (in line with the theories of Dreyfus, 1991, and Duval, 1999). The results have shown the negative effect of closure on generalisation in symbolic representations, the predominance of single-variance generalisation over covariant generalisation in tabular representations, and the reduced ability to readily identify commonalities and relationships in enactive and iconic representations. This chapter uses the results to explore the interrelation between generalisation and verbal and visual comprehension of context. The studies evidence the importance of understanding and communicating aspects of representational forms which allowed commonalities to be seen across or between representations. Finally, the chapter explores the implications of the studies for a theory that describes a growth in integration of models and representations that leads to generalisation.
Abstract:
High-density living in inner-urban areas has been promoted to encourage the use of more sustainable modes of travel and to reduce greenhouse gas emissions. However, previous research presents mixed results on the relationship between living in proximity to transport systems and reduced car dependency. This research examines inner-city residents’ transportation practices and perceptions, via 24 qualitative interviews with residents of high-density dwellings in inner-city Brisbane, Australia. Whilst participants consider public transport accessible and convenient, car use continues to be relied on for many journeys. Transportation choices are justified through complex definitions of convenience containing both utilitarian and psycho-social elements, with three key themes identified: time-efficiency, single versus multi-modal trips, and distance to and purpose of journey, as well as attitudinal, affective and symbolic elements related to transport mode use. Understanding the conceptions of transport convenience held by different segments of the transport users’ market, alongside other factors strongly implicated in travel mode choice, can ensure targeted improvements in sustainable transport service levels and infrastructure as well as in information service provision and behavioural change campaigns.