119 results for Distributed, Model, Actors, Software, Quality, Scalability
Abstract:
With the proliferation of relational database programs for PCs and other platforms, many business end-users are creating, maintaining, and querying their own databases. More importantly, business end-users use the output of these queries as the basis for operational, tactical, and strategic decisions. Inaccurate data reduce the expected quality of these decisions. Implementing various input validation controls, including higher levels of normalisation, can reduce the number of data anomalies entering the databases. Even in well-maintained databases, however, data anomalies will still accumulate. To improve the quality of data, databases can be queried periodically to locate and correct anomalies. This paper reports the results of two experiments that investigated the effects of different data structures on business end-users' abilities to detect data anomalies in a relational database. The results demonstrate that both unnormalised data structures and higher levels of normalisation lower the effectiveness and efficiency of queries relative to first normal form. First normal form databases appear to provide the most effective and efficient data structure for business end-users formulating queries to detect data anomalies.
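The kind of periodic anomaly-detection query the abstract describes can be illustrated with a small sketch (the table, columns, and data below are invented for illustration): in a first normal form table where customer details repeat across order rows, an update anomaly shows up as one customer identifier carrying more than one spelling of the same name.

```python
import sqlite3

# Hypothetical 1NF order table: customer details are repeated on every
# order row, so a partial update leaves the same customer_id with
# inconsistent names.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (
    order_id      INTEGER PRIMARY KEY,
    customer_id   INTEGER,
    customer_name TEXT,
    amount        REAL)""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [(1, 101, "Acme Ltd",     50.0),
     (2, 101, "Acme Limited", 75.0),   # anomaly: same id, different name
     (3, 102, "Bolt Co",      20.0)])

# Detection query: customer ids associated with more than one spelling.
rows = conn.execute("""
    SELECT customer_id, COUNT(DISTINCT customer_name) AS variants
    FROM orders
    GROUP BY customer_id
    HAVING variants > 1""").fetchall()
print(rows)  # [(101, 2)]
```

Queries of this shape stay simple against a 1NF structure; against a fully decomposed (higher normal form) schema the same check would require joins, which is consistent with the paper's finding that 1NF supports anomaly detection best.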
Abstract:
Purpose: Despite increased interest in quality of life (QOL) as an outcome measure and as a means of identifying client needs in health care, its conceptualisation and the identification of its constituents have been poorly researched for elderly people with stroke in Hong Kong. Method: This article analysed the literature to identify components relevant to the QOL of Chinese elderly people with stroke living in the community in Hong Kong. Results: While common components of QOL for elderly people with and without stroke and regardless of cultural background were identified, a number were specific to an elderly Chinese stroke population. Conclusion: Recommendations for future research have been made with reference to further exploring and validating these components for the target population. A clear understanding of these aspects is essential for the development of sensitive QOL measures to guide and evaluate service delivery to this population.
Abstract:
Computer assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with the interplay of dose and dosage interval with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance) in determining drug concentrations, using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others such as SAAM II, ModelMaker, and Stella represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons, and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors including price and networking capabilities of the package(s). Enhanced learning may result if the computer teaching of pharmacokinetics is supported by tutorials, especially where the techniques are applied to solving problems in which the link with healthcare practices is clearly established.
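The 'what if' interplay of dose and dosage interval described above can be sketched with a minimal one-compartment model built on superposition of IV bolus doses (all parameter values below are invented for illustration, not taken from any of the named packages): holding the total daily dose fixed, halving the dosage interval raises the pre-dose trough concentration.

```python
import math

def trough(dose_mg, vd_l, cl_l_per_h, tau_h, t_h):
    """Plasma concentration at time t_h under repeated IV bolus dosing
    every tau_h hours, one-compartment model with first-order elimination.
    Doses at t_h itself are excluded, so at a scheduled dose time this
    returns the pre-dose trough."""
    k = cl_l_per_h / vd_l                # elimination rate constant (1/h)
    c = 0.0
    dose_time = 0.0
    while dose_time < t_h:               # superpose all earlier doses
        c += (dose_mg / vd_l) * math.exp(-k * (t_h - dose_time))
        dose_time += tau_h
    return c

# Same daily dose (200 mg/day), two regimens, trough evaluated at t = 48 h:
c_q12 = trough(100, 40, 5, 12, 48)       # 100 mg every 12 h
c_q6  = trough(50, 40, 5, 6, 48)         # 50 mg every 6 h
print(round(c_q12, 2), round(c_q6, 2))   # 0.72 1.12
```

The shorter interval smooths the concentration profile, which is exactly the kind of dose/interval experiment the calculator-simulators support graphically.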
Abstract:
Incremental parsing has long been recognized as a technique of great utility in the construction of language-based editors, and correspondingly, the area currently enjoys a mature theory. Unfortunately, many practical considerations have been largely overlooked in previously published algorithms. Many user requirements for an editing system necessarily impact on the design of its incremental parser, but most approaches focus only on one: response time. This paper details an incremental parser based on LR parsing techniques and designed for use in a modeless syntax recognition editor. The nature of this editor places significant demands on the structure and quality of the document representation it uses, and hence, on the parser. The strategy presented here is novel in that both the parser and the representation it constructs are tolerant of the inevitable and frequent syntax errors that arise during editing. This is achieved by a method that differs from conventional error repair techniques, and that is more appropriate for use in an interactive context. Furthermore, the parser aims to minimize disturbance to this representation, not only to ensure other system components can operate incrementally, but also to avoid unfortunate consequences for certain user-oriented services. The algorithm is augmented with a limited form of predictive tree-building, and a technique is presented for the determination of valid symbols for menu-based insertion. Copyright (C) 2001 John Wiley & Sons, Ltd.
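The reuse idea at the heart of incremental analysis can be shown in miniature (using tokenisation as a stand-in for full LR parsing, which would not fit a short sketch; all names below are invented): after an edit, tokens strictly before the edited region are kept as-is, tokens strictly after it are position-shifted, and only the span in between is re-analysed, minimising disturbance to the existing representation.

```python
import re

TOKEN = re.compile(r"\S+")

def lex(text):
    """Full tokenisation: (position, token) pairs."""
    return [(m.start(), m.group()) for m in TOKEN.finditer(text)]

def incremental_lex(old_tokens, new_text, edit_start, edit_end, delta):
    """Re-tokenise only around an edit of old text [edit_start, edit_end),
    where delta = len(replacement) - (edit_end - edit_start)."""
    # Reuse tokens that end strictly before the edit, unchanged.
    prefix = [t for t in old_tokens if t[0] + len(t[1]) < edit_start]
    # Reuse tokens that start strictly after the edit, shifted by delta.
    suffix = [(pos + delta, tok) for pos, tok in old_tokens if pos > edit_end]
    # Re-lex only the gap between the reused regions.
    start = prefix[-1][0] + len(prefix[-1][1]) if prefix else 0
    stop = suffix[0][0] if suffix else len(new_text)
    middle = [(start + m.start(), m.group())
              for m in TOKEN.finditer(new_text[start:stop])]
    return prefix + middle + suffix

old_text = "alpha beta gamma"
old_tokens = lex(old_text)
# Replace "beta" (chars 6..10) with "BETAS": delta = 5 - 4 = 1.
new_text = "alpha BETAS gamma"
out = incremental_lex(old_tokens, new_text, 6, 10, 1)
print(out)  # [(0, 'alpha'), (6, 'BETAS'), (12, 'gamma')]
```

A real incremental LR parser applies the same prefix/suffix reuse to subtrees of the parse tree rather than to flat tokens, with the added obligations of error tolerance and minimal disturbance that the paper addresses.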
Abstract:
Service quality is assessed by customers along the dimensions of staff conduct, credibility, communication, and access to teller services. Credibility and staff conduct emerge as the highest loading first-order factors. This highlights the significance of rectifying mistakes while keeping customers informed, and employing branch staff that are responsive and civilized in their conduct. Discovery of a valid second-order factor, namely, overall customer service quality, underscores the importance of providing quality service across all its dimensions. For example, if the bank fails to rectify mistakes and keep customers informed but excels in all other dimensions, its overall customer service quality can still be rated poorly.
Abstract:
Despite evidence linking shrimp farming to several cases of environmental degradation, there remains a lack of ecologically meaningful information about the impacts of effluent on receiving waters. The aim of this study was to determine the biological impact of shrimp farm effluent, and to compare and distinguish its impacts from treated sewage effluent. Analyses included standard water quality/sediment parameters, as well as biological indicators including tissue nitrogen (N) content, stable isotope ratio of nitrogen (δ15N) and amino acid composition of inhabitant seagrasses, mangroves and macroalgae. The study area consisted of two tidal creeks, one receiving effluent from a sewage treatment plant and the other from an intensive shrimp farm. The creeks discharged into the western side of Moreton Bay, a sub-tropical coastal embayment on the east coast of Australia. Characterization of water quality revealed significant differences between the creeks, and with unimpacted eastern Moreton Bay. The sewage creek had higher concentrations of dissolved nutrients (predominantly NO3−/NO2− and PO43−, compared to NH4+ in the shrimp creek). In contrast, the shrimp creek was more turbid and had higher phytoplankton productivity. Beyond 750 m from the creek mouths, water quality parameters were indistinguishable from eastern Moreton Bay values. Biological indicators detected significant impacts up to 4 km beyond the creek mouths (reference site). Elevated plant δ15N values ranged from 10.4-19.6‰ at the site of sewage discharge to 2.9-4.5‰ at the reference site. The free amino acid concentration and composition of seagrass and macroalgae was used to distinguish between the uptake of sewage and shrimp derived N. Proline (seagrass) and serine (macroalgae) were high in sewage impacted plants, and glutamine (seagrass) and alanine (macroalgae) were high in plants impacted by shrimp effluent. The δ15N isotopic signatures and free amino acid composition of inhabitant flora indicated that sewage N extended further from the creek mouths than shrimp N. The combination of physical/chemical and biological indicators used in this study was effective in distinguishing the composition and subsequent impacts of aquaculture and sewage effluent on the receiving waters. (C) 2001 Academic Press.
Abstract:
The potential for the ethylene binding inhibitor, 1-methylcyclopropene, to delay ripening of 'Hass' avocado, 'African Pride' custard apple, 'Kensington Pride' mango and 'Solo' papaya was examined. Fruit were gassed with 25 μL/L 1-methylcyclopropene for 14 h at 20 °C, followed by treatment with 100 μL/L ethylene for 24 h, and then ripened at 20 °C. Ethylene treatment alone generally halved the number of days for fruit to reach the ripe stage, compared with untreated fruit. 1-Methylcyclopropene treatment alone increased the number of days to ripening by 4.4 days (40% increase), 3.4 days (58%), 5.1 days (37%) and 15.6 days (325%) for avocado, custard apple, mango and papaya, respectively, compared with untreated fruit. Applying 1-methylcyclopropene to the fruit before ethylene prevented the accelerated ripening normally associated with ethylene treatment, so that the number of days to ripening for fruit treated with 1-methylcyclopropene plus ethylene was similar to that for fruit treated with 1-methylcyclopropene alone. 1-Methylcyclopropene treatment was associated with slightly higher severity of external blemishes in papaya and custard apple, slightly higher rot severity in avocado, custard apple and papaya, and at least double the severity of stem rots in mango, relative to fruit not treated with 1-methylcyclopropene. Thus, 1-methylcyclopropene treatment has the potential to reduce the risk of premature ripening of avocado, custard apple, mango and papaya fruit due to accidental exposure to ethylene. However, additional precautions may be necessary to reduce the disease severity associated with 1-methylcyclopropene treatment.
Abstract:
Institutional research can be defined as "the activity in which the research effort of an academic institution is directed at the solution of its own problems and to the enhancement of its own performance" (Woodward, 1993, p. 113). This paper describes and reflects on an attempt at the University of Queensland to address the need for course quality appraisal for improvement. The strategy, Continuous Curriculum Review (CCR), is simply an attempt to trial and promote regular comprehensive data collection for developing 'snapshot' views of whole curricula, so that decisions about what to change and what to change first can be made in an empirically defensible and timely manner. The strategy and reporting protocols that were developed are described, and the costs and benefits of engaging in this kind of data gathering exercise for quality assurance and quality enhancement purposes are discussed.