890 results for Web sites-design


Relevance:

30.00%

Publisher:

Abstract:

Background: Studies have examined Internet use as a source of information by various populations; however, no study has examined the quality and accessibility of websites for people with aphasia, or their use of such sites. Aims: This study aimed to describe the quality, communicative accessibility, and readability of a sample of aphasia websites and to determine whether sites preferred by people with aphasia were those rated highly on measures of accessibility and quality. The perceptions of people with aphasia regarding the accessibility of the sites were compared with those of speech pathologists. The relationship between the quality and communicative accessibility of websites was analysed. Factors that may influence use of the Internet by people with aphasia and speech pathologists were explored. Methods & Procedures: Tools for measuring quality and communicative accessibility were developed and a sample of five websites was selected. Two participant groups (18 speech pathologists and 6 people with aphasia) assessed aphasia websites in terms of communicative accessibility. Speech pathologists also rated website quality. Spearman's rho was used to determine levels of agreement between variables. Outcomes & Results: People with aphasia and speech pathologists showed minimal agreement in their perceptions of communicative accessibility. However, the websites preferred by people with aphasia (Aphasia Help and Speakability) were of a very high standard. There was a weak relationship between quality and communicative accessibility, but it was not statistically significant. Conclusions: Accessible websites are not necessarily of high quality, and quality websites are not guaranteed to be easily accessible. People with aphasia did not agree with speech pathologists as to what makes a good aphasia website. Therefore, people with aphasia should be involved in the design of aphasia websites, since they are often the intended users.
If Internet use by people with aphasia increases in line with that of other health populations, speech pathologists need to have the skills and confidence to recommend appropriate sites to their clients.
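The agreement analysis above relies on Spearman's rho between two sets of ratings. As a minimal, self-contained sketch of that statistic (the website ratings below are invented for illustration, not the study's data):

```python
# Spearman's rho between two groups' website ratings: the Pearson
# correlation of the rank vectors. Ratings below are hypothetical.

def rank(values):
    """Assign 1-based average ranks, handling ties."""
    indexed = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(indexed):
        j = i
        while j + 1 < len(indexed) and values[indexed[j + 1]] == values[indexed[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average position of the tied block
        for k in range(i, j + 1):
            ranks[indexed[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the ranks of x and y."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical mean accessibility ratings for five websites:
aphasia_group = [4.2, 3.1, 4.8, 2.5, 3.9]
pathologists = [3.0, 4.1, 3.5, 2.8, 4.6]
print(round(spearman_rho(aphasia_group, pathologists), 3))  # → 0.2
```

A rho near zero, as in this toy example, is the kind of "minimal agreement" the study reports between the two groups.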


Socioeconomic considerations should have an important place in reserve design. Systematic reserve-selection tools allow simultaneous optimization for ecological objectives while minimizing costs, but are seldom used to incorporate socioeconomic costs in the reserve-design process. The sensitivity of this process to biodiversity data resolution has been studied widely, but the issue of socioeconomic data resolution has not previously been considered. We therefore designed marine reserves for biodiversity conservation with the constraint of minimizing commercial fishing revenue losses and investigated how economic data resolution affected the results. Incorporating coarse-resolution economic data from official statistics generated reserves that were only marginally less costly to the fishery than those designed with no attempt to minimize economic impacts. An intensive survey yielded fine-resolution data that, when incorporated in the design process, substantially reduced predicted fishery losses. Such an approach could help minimize fisher displacement because the least profitable grounds are selected for the reserve. Other work has shown that low-resolution biodiversity data can lead to underestimation of the conservation value of some sites and a risk of overlooking the most valuable areas; we have similarly shown that low-resolution economic data can cause underestimation of the profitability of some sites and a risk of inadvertently including these in the reserve. Detailed socioeconomic data are therefore an essential input for the design of cost-effective reserve networks.
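The trade-off described above, meeting biodiversity targets while minimizing fishing-revenue losses, can be sketched with a toy greedy site-selection heuristic. This is a simplification for illustration only (real reserve-selection tools use more sophisticated optimization), and all sites, costs, and species below are invented:

```python
# Greedy cost-aware reserve selection: repeatedly add the planning unit
# with the lowest fishing-revenue cost per newly covered target species.

def select_reserve(units, targets):
    """units: {site: (fishing_cost, set_of_species)}; targets: species to cover."""
    chosen, covered = [], set()
    remaining = dict(units)
    while covered < targets and remaining:
        def cost_per_gain(site):
            cost, species = remaining[site]
            gain = len((species & targets) - covered)
            return cost / gain if gain else float("inf")
        best = min(remaining, key=cost_per_gain)
        if cost_per_gain(best) == float("inf"):
            break  # no remaining site covers an unmet target
        chosen.append(best)
        covered |= remaining[best][1] & targets
        del remaining[best]
    return chosen, covered

# Invented planning units: (cost to the fishery, habitats present)
units = {
    "A": (10.0, {"coral", "seagrass"}),
    "B": (2.0, {"coral"}),
    "C": (3.0, {"seagrass", "kelp"}),
    "D": (50.0, {"coral", "seagrass", "kelp"}),
}
sites, covered = select_reserve(units, {"coral", "seagrass", "kelp"})
print(sites, covered)  # cheap sites C and B cover all targets; costly D is avoided
```

With fine-resolution costs the heuristic steers the reserve toward the least profitable grounds, which is the effect the abstract reports; coarse costs would blur exactly this distinction.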


Background: Designing novel proteins with site-directed recombination has enormous prospects. By locating effective recombination sites for swapping sequence parts, the probability that hybrid sequences have the desired properties is increased dramatically. The prohibitive requirements for applying current tools led us to investigate machine learning to assist in finding useful recombination sites from amino acid sequence alone. Results: We present STAR, the Site Targeted Amino acid Recombination predictor, which produces a score indicating the structural disruption caused by recombination for each position in an amino acid sequence. Example predictions, contrasted with those of alternative tools, illustrate STAR's utility in determining useful recombination sites. Overall, the correlation coefficient between the output of the experimentally validated protein design algorithm SCHEMA and the prediction of STAR is very high (0.89). Conclusion: STAR allows the user to explore useful recombination sites in amino acid sequences with unknown structure and unknown evolutionary origin. The predictor service is available from http://pprowler.itee.uq.edu.au/star.


A location-based search engine must be able to find and assign proper locations to Web resources. Host, content, and metadata location information are not sufficient to describe the location of resources, as they are ambiguous or unavailable for many documents. We introduce target location as the location of the users of Web resources. Target location is content-independent and can be applied to all types of Web resources. A novel method is introduced which uses log files and IP addresses to track the visitors of websites. The experiments show that target location can be calculated for almost all documents on the Web at the country level, and for the majority of them at the state and city levels. It can be assigned to Web resources as a new definition and dimension of location. It can be used separately or together with other relevant locations to define the geography of Web resources. This compensates for insufficient geographical information on Web resources and would facilitate the design and development of location-based search engines.
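The core idea, deriving a resource's target location from the geolocated IPs in its access log, can be sketched as follows. Everything here is hypothetical: the log lines are invented, and the tiny `IP_TO_COUNTRY` table stands in for a real geolocation database:

```python
# Derive a "target location" for a web resource: geolocate the client IPs
# in its access log and report the dominant visitor country.

from collections import Counter

IP_TO_COUNTRY = {  # toy lookup table (assumption; a real system uses a geo database)
    "203.0.113.7": "AU",
    "198.51.100.2": "US",
    "192.0.2.9": "AU",
}

def target_location(log_lines):
    """Return (country, share_of_visits) for the most frequent visitor country."""
    counts = Counter()
    for line in log_lines:
        ip = line.split()[0]  # common log format starts with the client IP
        country = IP_TO_COUNTRY.get(ip)
        if country:
            counts[country] += 1
    country, hits = counts.most_common(1)[0]
    return country, hits / sum(counts.values())

log = [
    '203.0.113.7 - - [10/Oct/2023] "GET /page HTTP/1.1" 200',
    '192.0.2.9 - - [10/Oct/2023] "GET /page HTTP/1.1" 200',
    '198.51.100.2 - - [10/Oct/2023] "GET /page HTTP/1.1" 200',
]
print(target_location(log))  # dominant country among this page's visitors
```

Aggregating at coarser or finer geographic keys (country, state, city) gives the different resolution levels the abstract mentions.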


The impact of information and communication technology use on learning outcomes for accounting students is not well understood. This study investigates the impact of design features of Blackboard used as a Web-based Learning Environment (WBLE) in teaching undergraduate accounting students. Specifically, this investigation reports on a number of Blackboard design features (e.g. delivery of lecture notes, announcements, online assessment and model answers) used to deliver learning materials regarded as necessary to enhance learning outcomes. Responses from 369 on-campus students provided data to develop a regression model that seeks to explain enhanced participation and mental effort. The final regression shows that student satisfaction with the use of a WBLE is associated with five design features or variables, including the usefulness and availability of lecture notes, online assessment, model answers, and online chat.
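The study's regression relates satisfaction to design-feature ratings. As a minimal sketch of that kind of fit, here is ordinary least squares with a single predictor for brevity (the study's model has five; the ratings below are invented):

```python
# Ordinary least squares for y = a + b*x, fit to hypothetical data relating
# one design-feature rating (usefulness of lecture notes) to satisfaction.

def fit_line(x, y):
    """Return intercept a and slope b minimizing squared error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

notes_rating = [1, 2, 3, 4, 5]           # invented feature ratings
satisfaction = [2.1, 2.9, 4.1, 5.0, 6.1]  # invented satisfaction scores
a, b = fit_line(notes_rating, satisfaction)
print(round(a, 2), round(b, 2))  # a positive slope: higher rating, higher satisfaction
```

The multi-feature version replaces the single slope with one coefficient per design feature, which is what lets the study say which features are associated with satisfaction.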


E-atmospherics have motivated an emerging body of research which reports that both virtual layouts and atmospherics encourage consumers to modify their shopping habits. While the literature has mainly analyzed the functional aspects of e-atmospherics, little has been done in terms of linking their characteristics to social (co-)creation. This paper focuses on the anatomy of the social dimension of e-atmospherics, which includes factors such as the aesthetic design of space, the influence of visual cues, the interpretation of shopping as a social activity, and the meaning of appropriate interactivity. We argue that web designers are social agents who interact within intangible social reference sets, restricted by the social standards, values, beliefs, status, and duties embedded within their local geographies. We aim to review the current understanding of the importance and voluntary integration of social cues displayed by web designers from a mature market and an emerging market, and provide an analysis-based recommendation towards the development of an integrated e-social atmospheric framework. We report findings from telephone interviews with an exploratory set of 10 web designers in each country. This allows us to re-interpret the web designers' reality regarding social e-atmospherics. We contend that by comprehending (before any consumer input) social capital, daily micro practices, habits, and routine, a deeper understanding of the preparatory and initial stages of social e-atmospherics, and of their expected functions, will be acquired.


The convergence of Internet technologies and the field of expert systems has offered new ways of sharing and distributing knowledge. However, there has been a general lack of research in the area of web-based expert systems (ES). This paper addresses the issues associated with the design, development, and use of web-based ES from the standpoint of the benefits and challenges of developing and using them. The original theory and concepts of conventional ES were reviewed and a knowledge engineering framework for developing them was revisited. The study considered three web-based ES: WITS-advisor, for e-business strategy development; Fish-Expert, for fish disease diagnosis; and IMIS, to promote intelligent interviews. The benefits and challenges in developing and using ES are discussed by comparing them with traditional standalone systems from development and application perspectives. © 2004 Elsevier B.V. All rights reserved.
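At the core of any such expert system, web-based or standalone, is rule-based inference. A minimal forward-chaining sketch is shown below; the rules are invented, loosely in the spirit of a fish-disease advisor like Fish-Expert, and are not taken from that system:

```python
# Minimal forward-chaining rule engine: fire every rule whose conditions
# hold, add its conclusion as a new fact, and repeat until nothing changes.

RULES = [  # (set of required facts, concluded fact) — hypothetical rules
    ({"white_spots", "scratching"}, "suspect_ich"),
    ({"suspect_ich"}, "recommend_salt_bath"),
]

def infer(facts, rules):
    """Return the closure of the given facts under the rules."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"white_spots", "scratching"}, RULES))  # chains to a recommendation
```

Wrapping such an engine behind a web form, so the user supplies the initial facts and reads back the conclusions, is essentially the architecture the surveyed web-based ES share.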


This paper reports the results of a web-based perception study of the ranking of peer-reviewed accounting journals by UK academics. The design of the survey instrument allows an interactive selection of journals to be scored. The web-based format is unique in that it also includes a step in which respondents classify the journals according to methodological perspective (paradigm). This is depicted graphically in the paper in a bubble diagram that shows the "positioning" of journals according to perceptions of both paradigm and quality.


Modern distributed control systems comprise a set of processors interconnected by a suitable communication network. For use in real-time control environments, such systems must be deterministic and generate specified responses within critical timing constraints. They should also be sufficiently robust to survive predictable events such as communication or processor faults. This thesis considers the problem of coordinating and synchronizing a distributed real-time control system under normal and abnormal conditions. Distributed control systems need to periodically coordinate the actions of several autonomous sites. Often the type of coordination required is the all-or-nothing property of an atomic action. Atomic commit protocols have been used to achieve this atomicity in distributed database systems, which are not subject to deadlines. This thesis addresses the problem of applying time constraints to atomic commit protocols so that decisions can be made within a deadline. A modified protocol is proposed which is suitable for real-time applications. The thesis also addresses the problem of ensuring that atomicity is provided even if processor or communication failures occur. Previous work has considered the design of atomic commit protocols for use in non-time-critical distributed database systems. However, in a distributed real-time control system a fault must not allow stringent timing constraints to be violated. This thesis proposes commit protocols using synchronous communications which can be made resilient to a single processor or communication failure and still satisfy deadlines. Previous formal models used to design commit protocols have had adequate state coverability but have omitted timing properties. They also assumed that sites communicated asynchronously and omitted the communications from the model. Timed Petri nets are used in this thesis to specify and design the proposed protocols, which are analysed for consistency and timeliness.
The communication system is also modelled within the Petri net specifications so that communication failures can be included in the analysis. Analysis of the Timed Petri net and the associated reachability tree is used to show that the proposed protocols always terminate consistently and satisfy timing constraints. Finally, the applications of this work are described. Two different types of applications are considered: real-time databases and real-time control systems. It is shown that it may be advantageous to use synchronous communications in distributed database systems, especially if predictable response times are required. Emphasis is given to the application of the developed commit protocols to real-time control systems. Using the same analysis techniques as those used for the design of the protocols, it can be shown that the overall system performs as expected both functionally and temporally.
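The central idea, an atomic commit decision that is always reached within a bounded time, can be sketched in a few lines. This is a simplified illustration of the deadline behaviour only, not the thesis's Petri-net-specified protocol; the participants below are hypothetical:

```python
# Deadline-bounded commit decision: commit only if every participant votes
# "yes" before the deadline; a late or lost vote is treated as a "no", so
# a decision (possibly "abort") is always made within the time budget.

import time

def commit_decision(vote_fns, deadline_s):
    """Collect votes sequentially; abort on any 'no', timeout, or missed deadline."""
    start = time.monotonic()
    for vote in vote_fns:
        remaining = deadline_s - (time.monotonic() - start)
        if remaining <= 0:
            return "abort"  # deadline expired: fail safe
        try:
            if vote(timeout=remaining) != "yes":
                return "abort"
        except TimeoutError:
            return "abort"  # treat a late vote as a 'no'
    return "commit"

# Hypothetical participants: a fast voter and one whose reply takes ~0.5 s
fast = lambda timeout: "yes"
def slow(timeout):
    if timeout < 0.5:
        raise TimeoutError  # cannot reply within the remaining budget
    time.sleep(0.5)
    return "yes"

print(commit_decision([fast, fast], deadline_s=1.0))  # prints "commit"
print(commit_decision([fast, slow], deadline_s=0.1))  # prints "abort"
```

Aborting on a missed vote preserves atomicity (no site commits unless all agreed) while guaranteeing the bounded decision time that a real-time control loop requires.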


When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either not available or not possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
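One standard way to combine several experts' elicited judgements into a single distribution is a weighted linear opinion pool. This is shown here as an illustrative assumption (the paper does not specify its aggregation rule, and SHELF supports several approaches); the expert probabilities and weights are invented:

```python
# Weighted linear opinion pool: the combined distribution is the weighted
# average of the experts' probability vectors over the same categories.

def linear_pool(expert_probs, weights):
    """Combine experts' probability vectors; weights need not sum to 1."""
    total = sum(weights)
    n = len(expert_probs[0])
    return [
        sum(w * p[i] for w, p in zip(weights, expert_probs)) / total
        for i in range(n)
    ]

# Hypothetical: three experts give probabilities for landcover classes
# (forest, cropland, urban) at one validation site; the third expert is
# trusted twice as much as the others.
experts = [
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.8, 0.1, 0.1],
]
pooled = linear_pool(experts, weights=[1, 1, 2])
print([round(p, 3) for p in pooled])  # → [0.725, 0.175, 0.1]
```

Because each expert's vector sums to one, the pooled vector does too, so it can be used directly as the elicited categorical distribution for the site.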


The pneumonia caused by Pneumocystis carinii is ultimately responsible for the death of many acquired immunodeficiency syndrome (AIDS) patients. Large doses of trimethoprim and pyrimethamine in combination with a sulphonamide and/or pentamidine suppress the infection but produce serious side-effects and seldom prevent recurrence after treatment withdrawal. However, the partial success of the aforementioned antifolates, and also of trimetrexate used alone, does suggest dihydrofolate reductase (DHFR) as a target for the development of antipneumocystis agents. From the DHFR inhibitory activities of 3'-substituted pyrimethamine analogues it was suggested that the 3'-(3'',3''-dimethyltriazen-1''-yl) substituent may be responsible for the greater activity against the P. carinii enzyme over the mammalian enzyme. Crystallographic and molecular modelling studies revealed considerable geometrical and electronic differences between the triazene and the chemically related formamidine functions that may account for the differences in DHFR inhibitory profiles. Structural and electronic parameters calculated for a series of 3'-(3'',3''-disubstitutedtriazen-1''-yl) pyrimethamine analogues did not correlate with the DHFR inhibitory activities. However, in vitro screening against P. carinii DHFR revealed that the 3''-hydroxyethyl-3''-benzyl analogue was the most active and selective. Models of the active sites of human and P. carinii DHFRs were constructed using DHFR sequence and structural homology data which had identified key residues involved in substrate and cofactor binding. Low-energy conformations of the 3'',3''-dimethyl and 3''-hydroxyethyl-3''-benzyl analogues, determined from nuclear magnetic resonance studies and theoretical calculations, were docked by superimposing the diaminopyrimidine fragment onto a previously docked pyrimethamine analogue. Enzyme kinetic data supported the 3''-hydroxyethyl-3''-benzyl moiety being located in the NADPH binding groove.
The 3''-benzyl substituent was able to locate to within 3 Å of a valine residue in the active site of P. carinii DHFR, thereby producing a hydrophobic contact. The equivalent residue in human DHFR is threonine, which is more hydrophilic and less likely to be involved in such a contact. This difference may account for the greater inhibitory activity this analogue has for P. carinii DHFR and provide a basis for future drug design. From an in vivo model of PCP in immunosuppressed rats it was established that the 3''-hydroxyethyl-3''-benzyl analogue was able to reduce the P. carinii burden more effectively with increasing doses, without causing any visible signs of toxicity. However, equivalent doses were not as effective as pentamidine, a current treatment of choice for Pneumocystis carinii pneumonia.


This work is concerned with the behaviour of thin webbed rolled steel joists or universal beams when they are subjected to concentrated loads applied to the flanges. The prime concern is the effect of high direct stresses causing web failure in a small region of the beam. The review shows that although many tests have been carried out on rolled steel beams and built up girders, no series of tests has restricted the number of variables involved to enable firm conclusions to be drawn. The results of 100 tests on several different rolled steel universal beam sections having various types of loading conditions are presented. The majority of the beams are tested by loading with two opposite loads, thus eliminating the effects of bending and shear, except for a small number of beams which are tested simply supported on varying spans. The test results are first compared with the present design standard (BS 449) and it is shown that the British Standard is very conservative for most of the loading conditions included in the tests but is unsafe for others. Three possible failure modes are then considered, overall elastic buckling of the web, flexural yielding of the web due to large out of plane deflexions and local crushing of the material at the junction of the web and the root fillets. Each mode is considered theoretically and developed to establish the main variables, thus enabling a comparison to be made with the test results. It is shown that all three failure modes have a particular relevance for individual loading conditions, but that determining the failure load given the beam size and the loading conditions is very difficult in certain instances. Finally it is shown that there are some empirical relationships between the failure loads and the type of loading for various beam serial sizes.


This study is concerned with quality and productivity aspects of traditional house building. The research focuses on these issues by concentrating on the services and finishing stages of the building process. These are work stages which have not been fully investigated in previous productivity-related studies. The primary objective of the research is to promote an integrated design and construction led approach to traditional house building based on an original concept of 'development cycles'. This process involves the following: site monitoring; the analysis of work operations; implementing design and construction changes founded on unique information collected during site monitoring; and subsequent re-monitoring to measure and assess the effect of change. A volume house building firm has been involved in this applied research and has allowed access to its sites for production monitoring purposes. The firm also assisted in design detailing for a small group of 'experimental' production houses where various design and construction changes were implemented. Results from the collaborative research have shown certain quality and productivity improvements to be possible using this approach, albeit on a limited scale at this early experimental stage. The improvements have been possible because an improved activity sampling technique, developed for and employed by the study, has been able to describe why many quality and productivity related problems occur during site building work. Experience derived from the research has shown the following attributes to be important: positive attitudes towards innovation; effective communication; careful planning and organisation; and good coordination and control at site level. These are all essential aspects of quality-led management and determine to a large extent the overall success of this approach.
Future work recommendations must include a more widespread use of innovative practices so that further design and construction modifications can be made. By doing this, productivity can be improved, cost savings made and better quality afforded.