42 results for Web sites -- Design


Relevance:

30.00%

Abstract:

This paper reports the results of a web-based perception study of the ranking of peer-reviewed accounting journals by UK academics. The design of the survey instrument allows an interactive selection of journals to be scored. The web-based format is unique in that it also includes a step in which respondents classify the journals according to methodological perspective (paradigm). This is depicted graphically in the paper in a bubble diagram that shows the "positioning" of journals according to perceptions of both paradigm and quality.
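
The "bubble diagram" positioning described above is straightforward to reproduce. A minimal sketch with matplotlib, using hypothetical journal scores rather than the paper's survey data (the names, axes and counts are all illustrative assumptions):

```python
import matplotlib.pyplot as plt

# Hypothetical elicited scores, not the paper's actual survey data:
# name -> (paradigm score, quality score, number of respondents).
journals = {
    "Journal A": (1.2, 4.1, 35),
    "Journal B": (3.8, 3.0, 20),
    "Journal C": (2.5, 4.6, 50),
}

fig, ax = plt.subplots()
for name, (paradigm, quality, n) in journals.items():
    # Bubble area scaled by the number of respondents scoring the journal.
    ax.scatter(paradigm, quality, s=n * 20, alpha=0.5)
    ax.annotate(name, (paradigm, quality))

ax.set_xlabel("Perceived methodological paradigm")
ax.set_ylabel("Perceived quality")
plt.show()
```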

Relevance:

30.00%

Abstract:

Modern distributed control systems comprise a set of processors interconnected by a suitable communication network. For use in real-time control environments, such systems must be deterministic and generate specified responses within critical timing constraints. They should also be sufficiently robust to survive predictable events such as communication or processor faults. This thesis considers the problem of coordinating and synchronizing a distributed real-time control system under normal and abnormal conditions. Distributed control systems need to periodically coordinate the actions of several autonomous sites. Often the type of coordination required is the all-or-nothing property of an atomic action. Atomic commit protocols have been used to achieve this atomicity in distributed database systems, which are not subject to deadlines. This thesis addresses the problem of applying time constraints to atomic commit protocols so that decisions can be made within a deadline. A modified protocol is proposed which is suitable for real-time applications. The thesis also addresses the problem of ensuring that atomicity is provided even if processor or communication failures occur. Previous work has considered the design of atomic commit protocols for use in non-time-critical distributed database systems. However, in a distributed real-time control system a fault must not allow stringent timing constraints to be violated. This thesis proposes commit protocols using synchronous communications which can be made resilient to a single processor or communication failure and still satisfy deadlines. Previous formal models used to design commit protocols have had adequate state coverability but have omitted timing properties. They also assumed that sites communicated asynchronously and omitted the communications from the model. Timed Petri nets are used in this thesis to specify and design the proposed protocols, which are analysed for consistency and timeliness. The communication system is also modelled within the Petri net specifications so that communication failures can be included in the analysis. Analysis of the Timed Petri net and the associated reachability tree is used to show that the proposed protocols always terminate consistently and satisfy timing constraints. Finally, the applications of this work are described. Two different types of applications are considered: real-time databases and real-time control systems. It is shown that it may be advantageous to use synchronous communications in distributed database systems, especially if predictable response times are required. Emphasis is given to the application of the developed commit protocols to real-time control systems. Using the same analysis techniques as those used for the design of the protocols, it can be shown that the overall system performs as expected both functionally and temporally.
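
The core idea of a deadline-bounded atomic commit can be illustrated with a timeout-driven two-phase-commit coordinator. This is a rough Python sketch, not the thesis's protocol: the participant objects and their `vote_request`/`decide` methods are assumed stand-ins for the synchronous message exchanges modelled by the Timed Petri nets.

```python
import time

def commit_with_deadline(participants, deadline_s: float) -> str:
    """Two-phase commit that aborts rather than miss a global deadline."""
    start = time.monotonic()
    decision = "commit"

    # Phase 1: collect votes. A timeout or a "no" vote forces abort,
    # preserving the all-or-nothing (atomicity) property.
    for p in participants:
        remaining = deadline_s - (time.monotonic() - start)
        if remaining <= 0:
            decision = "abort"
            break
        try:
            if p.vote_request(timeout=remaining) != "yes":
                decision = "abort"
                break
        except TimeoutError:  # communication or processor failure
            decision = "abort"
            break

    # Phase 2: deliver the decision; a failed participant would learn
    # the outcome on recovery via a termination protocol.
    for p in participants:
        p.decide(decision)
    return decision
```

With synchronous communications the worst-case duration of both phases is bounded, which is what allows the reachability analysis described above to verify that every terminal state is both consistent and reached within the deadline.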

Relevance:

30.00%

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either unavailable or impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to reach a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the Web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
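
The continuous-variable tool extends SHELF, in which a parametric distribution is fitted to the quantile judgements an expert provides. A minimal sketch of that fitting step (the judgements and the choice of a normal distribution are illustrative assumptions; this is not the UncertWeb code):

```python
import numpy as np
from scipy import optimize, stats

# Hypothetical expert judgements: 5th, 50th and 95th percentiles
# of an uncertain model input.
probs = np.array([0.05, 0.50, 0.95])
judged = np.array([2.0, 5.0, 9.5])

def loss(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # Distance between the expert's quantiles and those of a
    # candidate normal distribution.
    fitted = stats.norm.ppf(probs, loc=mu, scale=sigma)
    return float(np.sum((fitted - judged) ** 2))

result = optimize.minimize(loss, x0=[5.0, 2.0], method="Nelder-Mead")
mu, sigma = result.x
print(f"Elicited prior: Normal(mu={mu:.2f}, sigma={sigma:.2f})")
```

In a group setting the same fit can be run per expert, with the fitted distributions then pooled or used as the starting point for the consensus discussion.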

Relevance:

30.00%

Abstract:

The pneumonia caused by Pneumocystis carinii is ultimately responsible for the death of many acquired immunodeficiency syndrome (AIDS) patients. Large doses of trimethoprim and pyrimethamine in combination with a sulphonamide and/or pentamidine suppress the infection but produce serious side-effects and seldom prevent recurrence after treatment withdrawal. However, the partial success of the aforementioned antifolates, and also of trimetrexate used alone, does suggest dihydrofolate reductase (DHFR) as a target for the development of antipneumocystis agents. From the DHFR inhibitory activities of 3'-substituted pyrimethamine analogues it was suggested that the 3'-(3'',3''-dimethyltriazen-1''-yl) substituent may be responsible for the greater activity towards the P. carinii enzyme than the mammalian one. Crystallographic and molecular modelling studies revealed considerable geometrical and electronic differences between the triazene and the chemically related formamidine functions that may account for the differences in DHFR inhibitory profiles. Structural and electronic parameters calculated for a series of 3'-(3'',3''-disubstitutedtriazen-1''-yl)pyrimethamine analogues did not correlate with the DHFR inhibitory activities. However, in vitro screening against P. carinii DHFR revealed that the 3''-hydroxyethyl-3''-benzyl analogue was the most active and selective. Models of the active sites of human and P. carinii DHFRs were constructed using DHFR sequence and structural homology data which had identified key residues involved in substrate and cofactor binding. Low-energy conformations of the 3'',3''-dimethyl and 3''-hydroxyethyl-3''-benzyl analogues, determined from nuclear magnetic resonance studies and theoretical calculations, were docked by superimposing the diaminopyrimidine fragment onto a previously docked pyrimethamine analogue. Enzyme kinetic data supported the 3''-hydroxyethyl-3''-benzyl moiety being located in the NADPH binding groove. The 3''-benzyl substituent was able to locate to within 3 Å of a valine residue in the active site of P. carinii DHFR, thereby producing a hydrophobic contact. The equivalent residue in human DHFR is threonine, which is more hydrophilic and less likely to be involved in such a contact. This difference may account for the greater inhibitory activity this analogue has for P. carinii DHFR and provide a basis for future drug design. From an in vivo model of PCP in immunosuppressed rats it was established that the 3''-hydroxyethyl-3''-benzyl analogue was able to reduce the P. carinii burden more effectively with increasing doses, without causing any visible signs of toxicity. However, equivalent doses were not as effective as pentamidine, a current treatment of choice for Pneumocystis carinii pneumonia.

Relevance:

30.00%

Abstract:

This work is concerned with the behaviour of thin-webbed rolled steel joists or universal beams when they are subjected to concentrated loads applied to the flanges. The prime concern is the effect of high direct stresses causing web failure in a small region of the beam. The literature review shows that although many tests have been carried out on rolled steel beams and built-up girders, no series of tests has restricted the number of variables involved sufficiently to enable firm conclusions to be drawn. The results of 100 tests on several different rolled steel universal beam sections having various types of loading conditions are presented. The majority of the beams are tested by loading with two opposite loads, thus eliminating the effects of bending and shear, except for a small number of beams which are tested simply supported on varying spans. The test results are first compared with the present design standard (BS 449) and it is shown that the British Standard is very conservative for most of the loading conditions included in the tests but is unsafe for others. Three possible failure modes are then considered: overall elastic buckling of the web, flexural yielding of the web due to large out-of-plane deflexions, and local crushing of the material at the junction of the web and the root fillets. Each mode is considered theoretically and developed to establish the main variables, thus enabling a comparison to be made with the test results. It is shown that all three failure modes have a particular relevance for individual loading conditions, but that determining the failure load given the beam size and the loading conditions is very difficult in certain instances. Finally it is shown that there are some empirical relationships between the failure loads and the type of loading for various beam serial sizes.

Relevance:

30.00%

Abstract:

This study is concerned with quality and productivity aspects of traditional house building. The research focuses on these issues by concentrating on the services and finishing stages of the building process, work stages which have not been fully investigated in previous productivity-related studies. The primary objective of the research is to promote an integrated design- and construction-led approach to traditional house building based on an original concept of 'development cycles'. This process involves the following: site monitoring; the analysis of work operations; implementing design and construction changes founded on unique information collected during site monitoring; and subsequent re-monitoring to measure and assess the effect of change. A volume house-building firm has been involved in this applied research and has allowed access to its sites for production monitoring purposes. The firm also assisted in design detailing for a small group of 'experimental' production houses where various design and construction changes were implemented. Results from the collaborative research have shown certain quality and productivity improvements to be possible using this approach, albeit on a limited scale at this early experimental stage. The improvements have been possible because an improved activity sampling technique, developed for and employed by the study, has been able to describe why many quality- and productivity-related problems occur during site building work. Experience derived from the research has shown the following attributes to be important: positive attitudes towards innovation; effective communication; careful planning and organisation; and good coordination and control at site level. These are all essential aspects of quality-led management and determine to a large extent the overall success of this approach. Future work recommendations must include a more widespread use of innovative practices so that further design and construction modifications can be made. By doing this, productivity can be improved, cost savings made and better quality afforded.

Relevance:

30.00%

Abstract:

This thesis explores how the World Wide Web can be used to support English language teachers doing further studies at a distance. The future of education worldwide is moving towards a requirement that we, as teacher educators, use the latest web technology not as a gambit, but as a viable tool to improve learning. By examining the literature on knowledge, teacher education and web training, a model of teacher knowledge development, along with statements of advice for web developers based upon the model, is developed. Next, the applicability and viability of both the model and the statements of advice are examined by developing a teacher support site (http://www.philseflsupport.com) according to these principles. The data collected from one focus group of users from sixteen different countries, all studying on the same distance Masters programme, is then analysed in depth. The outcomes from the research are threefold. A functioning website that is averaging around 15,000 hits a month provides a professional contribution. An expanded model of teacher knowledge development, based upon five theoretical principles that reflect the ever-expanding cyclical nature of teacher learning, provides an academic contribution. A series of six statements of advice for developers of teacher support sites provides the third. These statements are grounded in the theoretical principles behind the model of teacher knowledge development and incorporate nine keys to effective web facilitation. Taken together, they provide a forward-looking contribution to the praxis of web-supported teacher education, and thus to the potential dissemination of the research presented here. The research has succeeded in reducing the proliferation of terminology in teacher knowledge into a succinct model of teacher knowledge development. The model may now be used to further our understanding of how teachers learn and develop as other research builds upon the individual study here. NB: Appendix 4 is available only for consultation at Aston University Library with prior arrangement.

Relevance:

30.00%

Abstract:

Purine and pyrimidine triplex-forming oligonucleotides (TFOs), as potential antibacterial agents, were designed to bind by Hoogsteen and reverse Hoogsteen hydrogen bonds in a sequence-specific manner in the major groove of genomic DNA at specific polypurine sites within the gyrA gene of E. coli and S. pneumoniae. Sequences were prepared by automated synthesis, with purification and characterisation by high-performance liquid chromatography, capillary electrophoresis and mass spectrometry. Triplex stability was assessed using melting curves, in which the binding of the third strand to the duplex target was assessed over a temperature range of 0-80 °C, at pH 6.4 and 7.2. The most successful of the unmodified TFOs (6) showed a Tm value of 26 °C at both pH values, with binding via reverse Hoogsteen bonds. Binding to genomic DNA was also demonstrated by spectrofluorimetry, using fluorescein-labelled TFOs, from which dissociation constants were determined. Modifications in the form of 5mC, 5'-acridine attachment, phosphorothioation, 2'-O-methylation and phosphoramidation were made in order to increase Tm values. Phosphoramidate modification was the most effective, with increased Tm values of 42 °C. However, the final purity of these sequences was poor owing to their difficult syntheses. FACS (fluorescence-activated cell sorting) analysis was used to determine the potential uptake of a fluorescently labelled analogue of 6 via passive, cold-shock-mediated and anionic-liposome-aided uptake, at 20 °C and 37 °C. At both temperatures, anionic lipid-mediated uptake produced unrivalled fluorescence, equivalent to 20 and 43% at 20 and 37 °C respectively. Antibacterial activity of each oligonucleotide was assessed by viable count analysis relying on passive uptake, cold-shocking techniques, chlorpromazine-mediated uptake, and cationic and anionic lipid-aided uptake. All oligonucleotides were assessed for their ability to enhance uptake, which is a major barrier to the effectiveness of these agents. Compound 6 under cold-shocking conditions produced the greatest consistent decline in colony-forming units per ml. Results for this compound were sometimes variable, indicating inconsistent uptake by this particular assay method.
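
Tm values such as those quoted above are conventionally read from a UV melting curve as the midpoint of the sigmoidal absorbance transition. A rough sketch of that extraction using synthetic data (not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def melting_curve(T, low, high, Tm, width):
    """Sigmoidal absorbance transition; Tm is the midpoint temperature."""
    return low + (high - low) / (1.0 + np.exp(-(T - Tm) / width))

# Synthetic absorbance readings over a 0-80 degree C temperature ramp.
T = np.linspace(0.0, 80.0, 33)
A = melting_curve(T, 0.50, 0.65, 26.0, 3.0)
A += np.random.default_rng(0).normal(0.0, 0.002, T.size)  # measurement noise

params, _ = curve_fit(melting_curve, T, A, p0=[0.5, 0.65, 30.0, 5.0])
print(f"Tm = {params[2]:.1f} C")
```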

Relevance:

30.00%

Abstract:

OBJECTIVES: The objective of this research was to design a clinical decision support system (CDSS) that supports heterogeneous clinical decision problems and runs on multiple computing platforms. Meeting this objective required a novel design to create an extendable and easy-to-maintain clinical CDSS for point-of-care support. The proposed solution was evaluated in a proof-of-concept implementation. METHODS: Based on our earlier research on the design of a mobile CDSS for emergency triage, we used ontology-driven design to represent the essential components of a CDSS. Models of clinical decision problems were derived from the ontology and processed into executable applications at runtime. This allowed the applications' functionality to be scaled to the capabilities of the computing platforms. A prototype of the system was implemented using an extended client-server architecture and Web services to distribute the functions of the system and to make it operational in limited-connectivity conditions. RESULTS: The proposed design provided a common framework that facilitated the development of diversified clinical applications running seamlessly on a variety of computing platforms. It was prototyped for two clinical decision problems and settings (triage of acute pain in the emergency department and postoperative management of radical prostatectomy on the hospital ward) and implemented on two computing platforms: desktop and handheld computers. CONCLUSIONS: The requirement of CDSS heterogeneity was satisfied by the ontology-driven design. Processing of application models described with the help of ontological models allowed a complex system to run on multiple computing platforms with different capabilities. Finally, the separation of models and runtime components contributed to the improved extensibility and maintainability of the system.
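
The separation the paper describes, between a platform-neutral decision model derived from the ontology and the runtime that renders it, can be sketched as below. The class names and the triage example are illustrative assumptions, not the system's actual components:

```python
from dataclasses import dataclass, field

@dataclass
class DecisionStep:
    """One node of a clinical decision model derived from the ontology."""
    question: str
    options: list

@dataclass
class DecisionModel:
    """Platform-neutral model; each client renders it at runtime."""
    name: str
    steps: list = field(default_factory=list)

def render(model: DecisionModel, platform: str) -> None:
    # The same model is scaled to the capabilities of the target
    # platform, e.g. richer prompts on desktop than on a handheld.
    for step in model.steps:
        if platform == "desktop":
            print(f"{step.question}  [{' / '.join(step.options)}]")
        else:
            print(f"{step.question[:30]}? {step.options}")

triage = DecisionModel(
    name="acute-pain-triage",
    steps=[DecisionStep("Pain severity on a 0-10 scale", ["0-3", "4-7", "8-10"])],
)
render(triage, platform="desktop")
render(triage, platform="handheld")
```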

Relevance:

30.00%

Abstract:

Using the resistance literature as an underpinning theoretical framework, this chapter analyzes how Web designers, through their daily practices, (i) adopt recursive, adaptive, and resisting behavior regarding the inclusion of social cues online and (ii) shape the socio-technical power relationship between designers and other stakeholders. Five vignettes in the form of case studies with expert individual Web designers are used. Findings point to three types of emerging resistance, namely market-driven resistance, ideological resistance, and functional resistance. In addition, a series of propositions is provided linking the various themes. Furthermore, the authors suggest that stratification in Web designer types is occurring and that resistance offers a novel lens through which to analyze the debate.

Relevance:

30.00%

Abstract:

The calcitonin receptor-like receptor (CLR) acts as a receptor for the calcitonin gene-related peptide (CGRP) but in order to recognize CGRP, it must form a complex with an accessory protein, receptor activity modifying protein 1 (RAMP1). Identifying the protein/protein and protein/ligand interfaces in this unusual complex would aid drug design. The role of the extreme N-terminus of CLR (Glu23-Ala60) was examined by an alanine scan and the results were interpreted with the help of a molecular model. The potency of CGRP at stimulating cAMP production was reduced at Leu41Ala, Gln45Ala, Cys48Ala and Tyr49Ala; furthermore, CGRP-induced receptor internalization at all of these receptors was also impaired. Ile32Ala, Gly35Ala and Thr37Ala all increased CGRP potency. CGRP specific binding was abolished at Leu41Ala, Ala44Leu, Cys48Ala and Tyr49Ala. There was significant impairment of cell surface expression of Gln45Ala, Cys48Ala and Tyr49Ala. Cys48 takes part in a highly conserved disulfide bond and is probably needed for correct folding of CLR. The model suggests that Gln45 and Tyr49 mediate their effects by interacting with RAMP1 whereas Leu41 and Ala44 are likely to be involved in binding CGRP. Ile32, Gly35 and Thr37 form a separate cluster of residues which modulate CGRP binding. The results from this study may be applicable to other family B GPCRs which can associate with RAMPs.
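
Potency shifts of the kind reported above are usually quantified as EC50 values fitted to concentration-response data. A rough sketch of such a fit with synthetic numbers (not the study's data):

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic (Hill) concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)

# Synthetic cAMP responses to increasing CGRP concentrations (nM).
conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0, 1000.0])
resp = np.array([2.0, 5.0, 20.0, 60.0, 90.0, 98.0])

params, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 5.0, 1.0])
# A right-shifted curve (higher EC50) corresponds to reduced potency.
print(f"EC50 = {params[2]:.2f} nM")
```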

Relevance:

30.00%

Abstract:

Monitoring land-cover changes on sites of conservation importance allows environmental problems to be detected, solutions to be developed and the effectiveness of actions to be assessed. However, the remoteness of many sites or a lack of resources means these data are frequently not available. Remote sensing may provide a solution, but large-scale mapping and change detection may not be appropriate, necessitating site-level assessments. These need to be easy to undertake, rapid and cheap. We present an example of a Web-based solution based on free and open-source software and standards (including PostGIS, OpenLayers, Web Map Services, Web Feature Services and GeoServer) to support assessments of land-cover change (and validation of global land-cover maps). Authorised users are provided with means to assess land-cover visually and may optionally provide uncertainty information at various levels: from a general rating of their confidence in an assessment to a quantification of the proportions of land-cover types within a reference area. Versions of this tool have been developed for the TREES-3 initiative (Simonetti, Beuchle and Eva, 2011), which monitors tropical land-cover change through ground-truthing at latitude/longitude degree confluence points, and for monitoring of change within and around Important Bird Areas (IBAs) by BirdLife International and the Royal Society for the Protection of Birds (RSPB). In this paper we present results from the second of these applications. We also present further details on the potential use of the land-cover change assessment tool on sites of recognised conservation importance, in combination with NDVI and other time-series data from the eStation (a system for receiving, processing and disseminating environmental data). We show how the tool can be used to increase the usability of earth observation data by local stakeholders and experts, and to assist in evaluating the impact of protection regimes on land-cover change.
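
Because the stack is built on OGC standards, the site imagery shown to assessors can be fetched from any compliant endpoint. A minimal sketch using the OWSLib client; the endpoint URL, layer name and bounding box are placeholders, not the project's actual services:

```python
from owslib.wms import WebMapService

# Placeholder endpoint; any WMS published by e.g. GeoServer would do.
wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")

response = wms.getmap(
    layers=["site_landcover"],          # hypothetical layer name
    srs="EPSG:4326",
    bbox=(-1.0, 52.0, 0.0, 53.0),       # lon/lat box around a site
    size=(512, 512),
    format="image/png",
    transparent=True,
)

with open("site_landcover.png", "wb") as f:
    f.write(response.read())
```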

Relevance:

30.00%

Abstract:

The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate with and understand one another. In many domains (e.g. geospatial), the data being described contain some uncertainty, often due to incomplete knowledge; meaningful processing of these data requires the uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for the interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. We adopt a Bayesian perspective and focus on the case where the inputs and outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions, encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can be easily extended. Uniform Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data-processing chains in the Semantic Web.
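
The soft-typed design can be illustrated by building a distribution element whose meaning comes from a dictionary URI rather than from a dedicated schema type. A sketch only: the element names and URIs are indicative of the approach, not the exact UncertML schema:

```python
import xml.etree.ElementTree as ET

# Soft-typed element: semantics are carried by the dictionary URI,
# not by a hard-coded <GaussianDistribution> schema type.
dist = ET.Element(
    "Distribution",
    definition="http://dictionary.example.org/distributions/normal",  # illustrative URI
)
for name, value in [("mean", "4.2"), ("variance", "0.36")]:
    param = ET.SubElement(
        dist,
        "parameter",
        definition=f"http://dictionary.example.org/distributions/normal/{name}",
    )
    param.text = value

print(ET.tostring(dist, encoding="unicode"))
```

The same pattern covers statistics and sets of realisations: only the dictionary reference changes, which is what keeps the schema generic and extensible.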

Relevance:

30.00%

Abstract:

This article characterizes key weaknesses in the ability of current digital libraries to support scholarly inquiry, and as a way to address these, proposes computational services grounded in semiformal models of the naturalistic argumentation commonly found in research literatures. It is argued that a design priority is to balance formal expressiveness with usability, making it critical to coevolve the modeling scheme with appropriate user interfaces for argument construction and analysis. We specify the requirements for an argument modeling scheme for use by untrained researchers and describe the resulting ontology, contrasting it with other domain modeling and semantic web approaches, before discussing passive and intelligent user interfaces designed to support analysts in the construction, navigation, and analysis of scholarly argument structures in a Web-based environment. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 17–47, 2007.
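
A modeling scheme of this kind, claims connected by typed argumentative links, reduces at its core to a small graph structure. A minimal sketch with illustrative node and relation types (not the paper's actual ontology):

```python
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    text: str

@dataclass
class ArgLink:
    """A typed, directed link between two claims."""
    source: str
    target: str
    relation: str  # e.g. "supports" or "challenges" (illustrative types)

claims = {
    "c1": Claim("c1", "Method M generalises to domain D."),
    "c2": Claim("c2", "M's evaluation used only a single corpus."),
}
links = [ArgLink("c2", "c1", "challenges")]

def challengers(claim_id: str) -> list:
    """Navigation primitive: everything that challenges a given claim."""
    return [l.source for l in links
            if l.target == claim_id and l.relation == "challenges"]

print(challengers("c1"))  # -> ['c2']
```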

Relevance:

30.00%

Abstract:

The Semantic Web (SW) offers an opportunity to develop novel, sophisticated forms of question answering (QA). Specifically, the availability of distributed semantic markup on a large scale opens the way to QA systems which can make use of such semantic information to provide precise, formally derived answers to questions. At the same time, the distributed, heterogeneous, large-scale nature of the semantic information introduces significant challenges. In this paper we describe PowerAqua, a QA system designed to exploit semantic markup on the web to provide answers to questions posed in natural language. PowerAqua does not assume that the user has any prior information about the semantic resources. The system takes as input a natural language query and translates it into a set of logical queries, which are then answered by consulting and aggregating information derived from multiple heterogeneous semantic sources.
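
The pipeline just described, from natural-language question to logical queries to aggregated answers, can be sketched as below. The function bodies are placeholders standing in for PowerAqua's actual components:

```python
def translate(nl_query: str) -> list:
    """Map a natural-language question onto query triples (placeholder)."""
    # e.g. "Which rivers flow through Spain?" ->
    return [("?river", "flows-through", "Spain")]

def answer_one(triple, source: dict) -> set:
    """Evaluate one logical query against one semantic source."""
    return set(source.get(triple, []))

def answer(nl_query: str, sources: list) -> set:
    answers = set()
    for triple in translate(nl_query):
        # Consult every heterogeneous source and aggregate the results.
        for source in sources:
            answers |= answer_one(triple, source)
    return answers

# Toy knowledge sources standing in for distributed semantic markup.
kb1 = {("?river", "flows-through", "Spain"): ["Ebro", "Tagus"]}
kb2 = {("?river", "flows-through", "Spain"): ["Guadalquivir"]}
print(answer("Which rivers flow through Spain?", [kb1, kb2]))
```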