9 results for Computers.

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance: 10.00%

Abstract:

Introduction: Electronic assistive technology (EAT) includes computers, environmental control systems and information technology systems, and is widely considered to be an important part of present-day life. Method: Fifty-six Irish community occupational therapists completed a questionnaire on EAT. All surveyed were able to identify the benefits of EAT. Results: While respondents reported that they should be able to assess for and prescribe EATs, only a third (19) were able to do so, and half (28) had not been able to do so in the past. Community occupational therapists identified themselves as having a role in a multidisciplinary team to assess for and prescribe EAT. Conclusion: The results suggest that it is important for occupational therapists to have up-to-date knowledge and training in assistive and computer technologies in order to respond to the occupational needs of clients.

Relevance: 10.00%

Abstract:

Accepted Version

Relevance: 10.00%

Abstract:

Motivated by accurate average-case analysis, MOdular Quantitative Analysis (MOQA) was developed at the Centre for Efficiency Oriented Languages (CEOL). In essence, MOQA allows the programmer to determine the average running time of a broad class of programs directly from the code in a (semi-)automated way. The MOQA approach has the property of randomness preservation, meaning that applying any operation to a random structure yields an output isomorphic to one or more random structures; this property is the key to systematic timing. Based on original MOQA research, we discuss the design and implementation of a new domain-specific scripting language built on randomness-preserving operations and random structures. It is designed to facilitate compositional timing by systematically tracking the distributions of inputs and outputs. The labelled partial order (LPO) is the basic data type of the language; the programmer combines built-in MOQA operations with restricted control-flow statements to write MOQA programs. This thesis formally specifies the MOQA language both syntactically and semantically, and a practical language interpreter implementation is provided and discussed. By analysing new algorithms and data-restructuring operations, we demonstrate the wide applicability of the MOQA approach. We also extend MOQA theory to a number of domains beyond average-case analysis, showing strong connections between MOQA and parallel computing, reversible computing and data-entropy analysis.
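The thesis's own implementation is not reproduced in this listing; as a rough sketch under stated assumptions (all names hypothetical, written in Python rather than the MOQA language itself), an LPO can be modelled as a node set, an order relation and a labelling, with operations obliged to keep the labelling consistent with the order:

```python
# Hypothetical sketch of a labelled partial order (LPO), the basic MOQA data
# type: nodes, an order relation, and labels attached to nodes. Illustrative
# only; not the MOQA implementation.
from dataclasses import dataclass, field

@dataclass
class LPO:
    nodes: set = field(default_factory=set)
    below: set = field(default_factory=set)     # pairs (lo, hi): lo precedes hi
    labels: dict = field(default_factory=dict)  # node -> attached label

    def add_order(self, lo, hi):
        """Record that node lo must carry a smaller label than node hi."""
        self.nodes.update({lo, hi})
        self.below.add((lo, hi))

    def respects_order(self):
        """True if the current labelling is consistent with the partial order."""
        return all(self.labels[lo] < self.labels[hi] for lo, hi in self.below)

# A three-node "V": both a and b sit below c, so c must get the largest label.
v = LPO()
v.add_order("a", "c")
v.add_order("b", "c")
v.labels = {"a": 1, "b": 2, "c": 3}
print(v.respects_order())  # True
```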

Relevance: 10.00%

Abstract:

The technological role of handheld devices is fundamentally changing. Portable computers were traditionally application-specific: they were designed and optimised to deliver a specific task. However, it is now commonly acknowledged that future handheld devices need to be multi-functional and capable of executing a range of high-performance applications. This thesis coins the term pervasive handheld computing systems to refer to this type of mobile device. Portable computers face a number of constraints in trying to meet these objectives: they are physically constrained by their size, their computational power, their memory resources, their power usage, and their networking ability. These constraints challenge pervasive handheld computing systems in achieving their multi-functional and high-performance requirements. This thesis proposes a two-pronged methodology to enable pervasive handheld computing systems to meet their future objectives. The methodology is a fusion of two independent yet complementary concepts. The first step utilises reconfigurable technology to enhance the physical hardware resources within the environment of a handheld device, recognising that reconfigurable computing has the potential to dynamically increase the system functionality and versatility of a handheld device without major loss in performance. The second step incorporates agent-based middleware protocols to help handheld devices effectively manage and utilise these reconfigurable hardware resources within their environment. The thesis asserts that the combined characteristics of reconfigurable computing and agent technology can meet the objectives of pervasive handheld computing systems.
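As an illustration only (the thesis's middleware is not reproduced here, and every name below is hypothetical), the second step of the methodology can be pictured as an agent that places tasks on the reconfigurable fabric when area allows and falls back to software otherwise:

```python
# Hypothetical sketch of an agent-based middleware decision: put a task on
# the reconfigurable fabric if it fits and is faster there, else run it in
# software. Names and numbers are illustrative, not from the thesis.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    area_needed: int     # fabric area the hardware version would occupy
    hw_speedup: float    # speedup over the software fallback

class ReconfigAgent:
    """Middleware agent owning a fixed amount of reconfigurable fabric."""

    def __init__(self, fabric_area):
        self.free_area = fabric_area
        self.loaded = {}  # task name -> Task currently configured in hardware

    def place(self, task):
        # Prefer hardware when the task fits and is actually faster there.
        if task.area_needed <= self.free_area and task.hw_speedup > 1.0:
            self.free_area -= task.area_needed
            self.loaded[task.name] = task
            return f"{task.name}: configured on fabric"
        return f"{task.name}: running in software"

agent = ReconfigAgent(fabric_area=100)
print(agent.place(Task("mpeg_decode", area_needed=60, hw_speedup=4.0)))
print(agent.place(Task("crypto", area_needed=70, hw_speedup=8.0)))  # no room left
```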

Relevance: 10.00%

Abstract:

The great demand for power-optimised devices shows promising economic potential and has drawn considerable attention in both industry and research. With the continuously shrinking CMOS process, not only dynamic power but also static power has emerged as a major concern in power reduction. Beyond power optimisation, average-case power estimation is significant for power-budget allocation, but is also challenging in terms of time and effort. In this thesis we introduce a methodology to support modular quantitative analysis for estimating the average power of circuits, on the basis of two concepts named Random Bag Preserving and Linear Compositionality. It shortens simulation time while sustaining high accuracy, increasing the feasibility of power estimation for large systems. For power saving, we first take advantage of the low-power characteristics of adiabatic logic and asynchronous logic to achieve ultra-low dynamic and static power. We propose two memory cells that can run in adiabatic and non-adiabatic modes: in adiabatic mode, about 90% of dynamic power can be saved compared with other up-to-date designs, and about 90% of leakage power is saved. Secondly, a novel logic family named Asynchronous Charge Sharing Logic (ACSL) is introduced, in which the realisation of completion detection is simplified considerably. Beyond the power reduction, ACSL brings another promising feature for average-power estimation, data-independency: this characteristic makes power estimation effortless and is meaningful for modular quantitative average-case analysis. Finally, a new asynchronous Arithmetic Logic Unit (ALU) with a ripple-carry adder, implemented using the logically reversible/bidirectional characteristic and exhibiting ultra-low power dissipation at a sub-threshold operating point, is presented. The proposed adder is able to operate multi-functionally.
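As a hedged sketch of the Linear Compositionality idea described above (illustrative numbers and names, not the thesis's tool), per-module average powers can be composed into a system-level estimate without re-simulating the whole circuit:

```python
# Minimal sketch: compose a system's average power linearly from per-module
# averages weighted by how often each module is active. Numbers are
# illustrative only.
def average_power(modules):
    """modules: list of (avg_power_mw, activity_fraction) per sub-circuit."""
    return sum(power * activity for power, activity in modules)

# A toy ALU built from three sub-circuits.
alu_estimate = average_power([
    (1.8, 1.00),  # adder, active every cycle
    (0.9, 0.35),  # shifter, active 35% of cycles
    (0.4, 0.10),  # flag logic, rarely active
])
print(f"estimated average power: {alu_estimate:.2f} mW")
```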

Relevance: 10.00%

Abstract:

Petrochemical plastics/polymers are a common feature of day-to-day living, occurring in packaging, furniture, mobile phones, computers, construction equipment etc. However, these materials are produced from non-renewable resources and are resistant to microbial degradation in the environment. Considerable research has therefore been carried out into the production of sustainable, biodegradable polymers amenable to microbial catabolism to CO2 and H2O. A key group of microbial polyesters, widely considered as optimal replacement polymers, are the polyhydroxyalkanoates (PHAs). Primary research in this area has focused on using recombinant pure cultures to optimise PHA yields; however, despite considerable success, the high costs of pure-culture fermentation have thus far hindered the commercial viability of PHAs produced this way. In more recent years, work has begun to focus on mixed cultures for the optimisation of PHA production, with the incorporation of wastes offering the greatest reductions in production cost. The scale of dairy processing in Ireland, and the high-organic-load wastewaters it generates, represent an excellent potential substrate for bioconversion to PHAs in a mixed-culture system. The current study sought to investigate the potential for such bioconversion in a laboratory-scale biological system, and to establish key operational and microbial characteristics of same. Two sequencing batch reactors were set up and operated along the lines of an enhanced biological phosphorus removal (EBPR) system, which has PHA accumulation as a key step within repeated rounds of anaerobic/aerobic cycling. The influents to the reactors varied only in the carbon sources provided. Reactor 1 received artificial wastewater with acetate alone, which is known to be readily converted to PHA in the anaerobic step of EBPR; the reactor 2 influent contained acetate and skim milk to imitate a dairy processing effluent. Chemical monitoring of nutrient remediation within the reactors was applied continuously, and EBPR-consistent performance was observed. Qualitative analysis of the sludge was carried out using fluorescence microscopy with the lipophilic stain Nile Blue A, and PHA production was confirmed in both reactors. Quantitative analysis via HPLC detection of crotonic acid derivatives revealed the fluorescence to be short-chain-length polyhydroxybutyrate, with biomass dry-weight accumulations of 11% and 13% observed in reactors 1 and 2, respectively. Gas chromatography-mass spectrometry for medium-chain-length methyl ester derivatives revealed the presence of hydroxyoctanoic, -decanoic and -dodecanoic acids in reactor 1; similar analyses in reactor 2 revealed monomers of 3-hydroxydodecenoic and 3-hydroxytetradecanoic acids. Investigation of the microbial ecology of both reactors was conducted in an attempt to identify key species potentially contributing to reactor performance. Culture-dependent investigations indicated that quite different communities were present in the two reactors. Reactor 1 isolates showed the following species distribution: Pseudomonas (82%), Delftia acidovorans (3%), Acinetobacter sp. (5%), Aminobacter sp. (3%), Bacillus sp. (3%), Thauera sp. (3%) and Cytophaga sp. (3%). Species distributions among the profiled reactor 2 isolates were more even: Pseudoxanthomonas (32%), Thauera sp. (24%), Acinetobacter (24%), Citrobacter sp. (8%), Lactococcus lactis (5%), Lysinibacillus (5%) and Elizabethkingia (2%).
In both reactors, Gammaproteobacteria dominated the cultured isolates. Culture-independent 16S rRNA gene analyses revealed differing profiles for the two reactors. The reactor 1 clone distribution was as follows: Zoogloea resiniphila (83%), Zoogloea oryzae (2%), Pedobacter composti (5%), Neisseriaceae sp. (2%), Rhodobacter sp. (2%), Runella defluvii (3%) and Streptococcus sp. (3%). The RFLP-based species distribution among the reactor 2 clones was as follows: Runella defluvii (50%), Zoogloea oryzae (20%), Flavobacterium sp. (9%), Simplicispira sp. (6%), uncultured Sphingobacteria sp. (6%), Arcicella (6%) and Leadbetterella byssophila (3%). Betaproteobacteria dominated the 16S rRNA gene clones identified in both reactors. FISH analysis with Nile Blue dual staining resolved these divergent findings, identifying the Betaproteobacteria as the dominant PHA accumulators within the reactor sludges, although species/strain-specific allocations could not be made. GC analysis of the sludge had indicated the presence of both medium-chain-length and short-chain-length PHAs accumulating in both reactors, and the cultured isolates from the reactors had previously been identified as mcl- and scl-PHA producers, respectively. Characterisations of the PHA monomer profiles of the individual isolates were therefore performed to screen for potentially novel scl-mcl PHAs. Nitrogen-limitation-driven PHA accumulation in E2 minimal media revealed a greater propensity among isolates for mcl-PHA production. HPLC analysis indicated that PHB production was not a major feature of the reactor isolates, and this was supported by the low presence of scl phaC1 genes among PCR-screened isolates. A high percentage distribution of phaC2 mcl-PHA synthase genes was recorded, with the majority sharing high homology with class II synthases from Pseudomonas sp. The common presence of a phaC2 homologue was not reflected in the production of a common polymer: considerable variation was noted in both monomer composition and ratios following GC analysis. While co-polymer production could not be demonstrated, potentially novel synthase substrate specificities were noted which could be exploited further in the future.

Relevance: 10.00%

Abstract:

This work considers the static calculation of a program's average-case time. The number of systems that currently tackle this research problem is quite small, owing to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research: the MOQA system, which consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labelling distribution. This research develops and evaluates the MOQA language implementation, and adds to the functions already available in the language. Furthermore, the theory that backs MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the MOQA applications and extensions suggested in other works are also examined logically here: for example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out during the course of this research reveal some of MOQA's strengths and weaknesses. This thesis aims to be pragmatic in evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA compared to similar systems. Succinctly, this work's significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA's accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
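For readers unfamiliar with the approach, the compositional property that strict control over structure and labelling distribution buys can be stated roughly as follows (notation ours, not the thesis's):

```latex
% Hedged statement: if running program P on random structure R produces
% random structures R'_1, ..., R'_k with probabilities p_1, ..., p_k, the
% average time of the sequential composition P;Q decomposes as
\[
  \overline{T}_{P;Q}(R) \;=\; \overline{T}_{P}(R)
      \;+\; \sum_{i=1}^{k} p_i \,\overline{T}_{Q}(R'_i)
\]
```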

Relevance: 10.00%

Abstract:

For at least two millennia and probably much longer, the traditional vehicle for communicating geographical information to end-users has been the map. With the advent of computers, the means of both producing and consuming maps have been radically transformed, while the inherent nature of the information product has also expanded and diversified rapidly. This has given rise in recent years to the new concept of geovisualisation (GVIS), which draws on the skills of the traditional cartographer but extends them into three spatial dimensions, and may also add temporality, photorealistic representations and/or interactivity. Demand for GVIS technologies and their applications has increased significantly in recent years, driven by the need to study complex geographical events, in particular their associated consequences, and to communicate the results of these studies to a diversity of audiences and stakeholder groups. GVIS involves data integration, multi-dimensional spatial display, advanced modelling techniques, dynamic design and development environments, and field-specific application needs. To meet these needs, GVIS tools should be both powerful and inherently usable, in order to facilitate their role in helping interpret and communicate geographic problems. However, no framework currently exists for ensuring this usability. The research presented here seeks to fill this gap by addressing the challenges of incorporating user requirements in GVIS tool design. It starts from the premise that usability in GVIS should be incorporated and implemented throughout the whole design and development process. To facilitate this, Subject Technology Matching (STM) is proposed as a new approach to assessing and interpreting user requirements. Based on STM, a new design framework called Usability Enhanced Coordination Design (UECD) is then presented, with the purpose of improving the overall usability of the design outputs. UECD places GVIS experts in a new key role in the design process, to form a more coordinated and integrated workflow and more focused and interactive usability testing. To prove the concept, these theoretical elements of the framework have been implemented in two test projects: one is the creation of a coastal inundation simulation for Whitegate, Cork, Ireland; the other is a flood-mapping tool for Zhushan Town, Jiangsu, China. The two case studies successfully demonstrated the potential merits of the UECD approach when GVIS techniques are applied to geographic problem solving and decision making. The thesis delivers a comprehensive understanding of the development and challenges of GVIS technology, its usability concerns and the associated user-centred design (UCD) approaches; it explores the possibility of applying a UCD framework to GVIS design; it constructs a new theoretical design framework, UECD, which aims to make the whole design process usability-driven; and it develops the key concept of STM into a template set to improve the performance of a GVIS design. These key conceptual and procedural foundations can be built on by future research aimed at further refining and developing UECD as a useful design methodology for GVIS scholars and practitioners.

Relevance: 10.00%

Abstract:

This research investigates some of the reasons for the reported difficulties experienced by writers when using editing software designed for structured documents. The overall objective was to determine whether there are aspects of the software interfaces which militate against optimal document construction by writers who are not computer experts, and to suggest possible remedies. Studies were undertaken to explore the nature and extent of the difficulties, and to identify which components of the software interfaces are involved. A model of a revised user interface was tested, and some possible adaptations to the interface are proposed which may help overcome the difficulties. The methodology comprised:
1. identification and description of the nature of a ‘structured document’ and what distinguishes it from other types of document used on computers;
2. isolation of the requirements of users of such documents, and the construction of a set of personas which describe them;
3. evaluation of other work on the interaction between humans and computers, specifically in software for creating and editing structured documents;
4. estimation of the levels of adoption of the available software for editing structured documents and the reactions of existing users to it, with specific reference to difficulties encountered in using it;
5. examination of the software and identification of any mismatches between the expectations of users and the facilities provided by the software;
6. assessment of any physical or psychological factors in the reported difficulties experienced, and determination of what (if any) changes to the software might affect these.
The conclusions are that seven of the twelve modifications tested could contribute to an improvement in usability, effectiveness, and efficiency when writing structured text (new document selection; adding new sections and new lists; identifying key information typographically; the creation of cross-references and bibliographic references; and the inclusion of parts of other documents). The remaining five were seen as more applicable to editing existing material than to authoring new text (adding new elements; splitting and joining elements [before and after]; and moving block text).
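As a minimal sketch of what distinguishes editing a structured document from editing flat text (illustrative Python using the standard library, not the software studied in the thesis), the editor manipulates a tree of typed elements, so an operation like "adding a new section" is a tree insertion rather than a string edit:

```python
# Toy structured document as an element tree; names are illustrative only.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<report><section><title>Background</title></section></report>"
)

# "Adding a new section" means inserting a correctly typed subtree.
new_section = ET.SubElement(doc, "section")
ET.SubElement(new_section, "title").text = "Method"

# Serialised output (one line, wrapped here for readability):
# <report><section><title>Background</title></section>
#         <section><title>Method</title></section></report>
print(ET.tostring(doc, encoding="unicode"))
```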