882 results for COMPREHENSIVE SET
Abstract:
The number of mammalian transcripts identified by full-length cDNA projects and genome sequencing projects is increasing rapidly. Clustering them into a strictly nonredundant and comprehensive set provides a platform for functional analysis of the transcriptome and proteome, but the quality of the clustering and its predictive usefulness have previously required manual curation to identify truncated transcripts and inappropriate clustering of closely related sequences. A Representative Transcript and Protein Sets (RTPS) pipeline was previously designed to identify the nonredundant and comprehensive set of mouse transcripts based on clustering of a large mouse full-length cDNA set (FANTOM2). Here we propose an alternative method that is more robust, requires less manual curation, and is applicable to other organisms in addition to mouse. RTPSs of human, mouse, and rat have been produced by this method and used for validation. Their comprehensiveness and quality are discussed by comparison with other clustering approaches. The RTPSs are available at ftp://fantom2.gsc.riken.go.jp/RTPS/. © 2004 Elsevier Inc. All rights reserved.
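The idea of reducing clustered transcripts to one representative each can be illustrated with a minimal sketch. This is not the RTPS pipeline's actual algorithm: the cluster assignment is taken as given, and "longest sequence wins" is only a simple proxy for completeness (truncated transcripts tend to be shorter).

```python
def representative_set(transcripts, cluster_of):
    """Pick one representative transcript per cluster.

    transcripts: dict mapping transcript id -> sequence
    cluster_of:  dict mapping transcript id -> cluster id
    Returns a dict mapping cluster id -> id of the longest transcript,
    a crude stand-in for 'most complete' within the cluster.
    """
    reps = {}
    for tid, seq in transcripts.items():
        cid = cluster_of[tid]
        if cid not in reps or len(seq) > len(transcripts[reps[cid]]):
            reps[cid] = tid
    return reps

# Toy data: three transcripts falling into two clusters.
transcripts = {"t1": "AUGGCC", "t2": "AUGGCCAAA", "t3": "AUG"}
cluster_of = {"t1": "g1", "t2": "g1", "t3": "g2"}
print(representative_set(transcripts, cluster_of))  # {'g1': 't2', 'g2': 't3'}
```

In the real pipeline, the clustering itself comes from sequence comparison and the representative choice involves curation criteria beyond length; the sketch only shows the nonredundant-set construction step.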
Abstract:
Employee turnover is giving sleepless nights to HR managers in many countries in Asia. A widely-held belief in these countries is that employees have developed 'bad' attitudes due to the labour shortage. Employees are believed to job-hop for no reason, or even for fun. Unfortunately, despite employee turnover being such a serious problem in Asia, there is a dearth of studies investigating it; in particular, studies using a comprehensive set of variables are rare. This study examines three sets of antecedents of turnover intention in companies in Singapore: demographic, controllable and uncontrollable. Singapore companies provide an appropriate setting as their turnover rates are among the highest in Asia. Findings of the study suggest that organisational commitment, procedural justice and a job-hopping attitude were the three main factors associated with turnover intention in Singapore companies.
Abstract:
Shropshire Energy Team initiated this study to examine consumption and associated emissions in the predominantly rural county of Shropshire. Current use of energy is not sustainable in the long term and there are various approaches to dealing with the environmental problems it creates. Energy planning by a local authority for a sustainable future requires detailed energy consumption and environmental information. This information would enable target setting and the implementation of policies designed to encourage energy efficiency improvements and exploitation of renewable energy resources. This could aid regeneration strategies by providing new employment opportunities. Associated reductions in carbon dioxide and other emissions would help to meet national and international environmental targets. In the absence of this detailed information, the objective was to develop a methodology to assess energy consumption and emissions on a regional basis from 1990 onwards for all local planning authorities. This would enable a more accurate assessment of the relevant issues, such that plans are more appropriate and longer lasting. A first comprehensive set of data has been gathered from a wide range of sources, and a strong correlation was found between population and energy consumption for a variety of regions across the UK. In this case the methodology was applied to the county of Shropshire to give, for the first time, estimates of primary fuel consumption, electricity consumption and associated emissions in Shropshire for 1990 to 2025. The estimates provide a suitable baseline for assessing the potential contribution renewable energy could play in meeting electricity demand in the county and in reducing emissions. The assessment indicated that in 1990 total primary fuel consumption was 63,518,018 GJ/y, increasing to 119,956,465 GJ/y by 2025. This is associated with emissions of 1,129,626 t/y of carbon in 1990, rising to 1,303,282 t/y by 2025.
In 1990, 22,565,713 GJ/y of the primary fuel consumption was used for generating electricity, rising to 23,478,050 GJ/y in 2025. If targets to reduce primary fuel consumption are reached, then emissions of carbon would fall to 1,042,626 t/y by 2025; if renewable energy targets were also reached, then emissions of carbon would fall to 988,638 t/y by 2025.
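The reported population–consumption correlation suggests how such regional estimates could be produced: fit consumption against population across known regions, then read off a value for the county of interest. A minimal sketch of that approach follows; the regional figures below are invented for illustration, not data from the study.

```python
# Hypothetical regional data: population (thousands) vs. primary fuel
# consumption (million GJ/y). Values are illustrative only.
regions = [(250, 40.0), (420, 66.0), (610, 95.0), (900, 142.0)]

# Ordinary least-squares fit of consumption = a * population + b.
n = len(regions)
sx = sum(p for p, _ in regions)
sy = sum(c for _, c in regions)
sxx = sum(p * p for p, _ in regions)
sxy = sum(p * c for p, c in regions)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# Estimated consumption (million GJ/y) for a county of 450,000 people.
estimate = a * 450 + b
```

The study's actual methodology combines many data sources and projects forward to 2025; the sketch shows only the core idea of exploiting the population correlation to fill gaps where no direct measurements exist.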
Abstract:
As a new medium for questionnaire delivery, the internet has the potential to revolutionize the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability. Designers of online-questionnaires are faced with a plethora of design tools to assist in the development of their electronic questionnaires. However, little, if any, support is incorporated within these tools to guide online-questionnaire designers according to best practice. In essence, an online-questionnaire combines questionnaire-based survey functionality with that of a webpage/site. As such, the design of an online-questionnaire should incorporate principles from both contributing fields. Drawing on existing guidelines for paper-based questionnaire design, website design (paying particular attention to issues of accessibility and usability), and existing but scarce guidelines for electronic surveys, we have derived a comprehensive set of guidelines for the design of online-questionnaires. This article introduces this comprehensive set of guidelines – as a practical reference guide – for the design of online-questionnaires.
Abstract:
As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online-questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns making such patterns virtually invisible to respondents. Like many new technologies, however, online-questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online-questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors. Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online-questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing, and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, thereby positively reinforcing the advantages of online-questionnaire delivery.
The second error type – the non-response error – occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online-questionnaires makes estimation of questionnaire length and time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation – and indeed, to provide respondents with context sensitive assistance during the response process – and thereby reduce abandonment while eliciting feelings of accomplishment [6]. For online-questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated. Sampling, measurement, and non-response errors are likely to occur when an online-questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online questionnaire delivery will not be fully realized.
To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online questionnaires. Many design guidelines exist for paper-based questionnaire design (e.g. [7-14]); the same is not true for the design of online questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online-questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online-questionnaires reduce traditional delivery costs (e.g. paper, mail out, and data entry), set up costs can be high given the need to either adopt and acquire training in questionnaire development software or secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge in questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.
Abstract:
As a new medium for questionnaire delivery, the Internet has the potential to revolutionize the survey process. Online-questionnaires can provide many capabilities not found in traditional paper-based questionnaires. Despite this, and the introduction of a plethora of tools to support online-questionnaire creation, current electronic survey design typically replicates the look-and-feel of paper-based questionnaires, thus failing to harness the full power of the electronic delivery medium. A recent environmental scan of online-questionnaire design tools found that little, if any, support is incorporated within these tools to guide questionnaire designers according to best-practice [Lumsden & Morgan 2005]. This paper briefly introduces a comprehensive set of guidelines for the design of online-questionnaires. Drawn from relevant disparate sources, all the guidelines incorporated within the set are proven in their own right; as an initial assessment of the value of the set of guidelines as a practical reference guide, we undertook an informal study to observe the effect of introducing the guidelines into the design process for a complex online-questionnaire. The paper discusses the qualitative findings of this case study, which are encouraging for the role of the guidelines in the ‘bigger picture’ of online survey delivery across many domains, such as e-government, e-business, and e-health.
Abstract:
As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability (Bandilla et al., 2003; Dillman, 2000; Kwak and Radler, 2002). Online-questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns making such patterns virtually invisible to respondents. Despite this, and the introduction of numerous tools to support online-questionnaire creation, current electronic survey design typically replicates that of paper-based questionnaires, failing to harness the full power of the electronic delivery medium. Worse, a recent environmental scan of online-questionnaire design tools found that little, if any, support is incorporated within these tools to guide questionnaire designers according to best-practice (Lumsden and Morgan, 2005). This article introduces a comprehensive set of guidelines - a practical reference guide - for the design of online-questionnaires.
Abstract:
METPEX is a 3-year FP7 project which aims to develop a Pan-European tool to measure the quality of the passenger's experience of multimodal transport. Initial work has led to the development of a comprehensive set of variables relating to different passenger groups, forms of transport and journey stages. This paper addresses the main challenges in transforming the variables into usable, accessible computer-based tools allowing for the real-time collection of information across multiple journey stages in different EU countries. Non-computer-based measurement instruments will be used to gather information from those who may not have or be familiar with mobile technology. Smartphone-based measurement instruments will also be used, hosted in two applications. The mobile applications need to be easy to use, configurable and adaptable according to the context of use. They should also be inherently interesting and rewarding for the participant, whilst allowing for the collection of high quality, valid and reliable data from all journey types and stages (from planning, through entry into and egress from different transport modes, travel on public and personal vehicles, and support of active forms of transport, e.g. cycling and walking). During all phases of data collection and processing, the privacy of the participant is safeguarded. © 2014 Springer International Publishing.
Abstract:
Archival research was conducted on the inception of preemployment psychological testing, as part of the background screening process, to select police officers for a local police department. Various issues and incidents were analyzed to help explain why this police department progressed from an abbreviated version of a psychological battery to a much more sophisticated and comprehensive set of instruments. While doubts about psychological exams do exist, research has shown that many are valid and reliable in predicting job performance of police candidates. During a three-year period, a police department hired 162 candidates (133 males and 29 females) who received "acceptable" psychological ratings and 71 candidates (58 males and 13 females) who received "marginal" psychological ratings. A document analysis examined variables identified as job-performance indicators that police psychological testing attempts to predict in order to "screen in" or "screen out" appropriate applicants. The areas of focus comprised the 6-month police academy, the 4-month Field Training Officer (FTO) Program, the remaining probationary period, and yearly performance up to five years of employment. Specific job performance variables were the final academy grade average, supervisors' evaluation ratings, reprimands, commendations, awards, citizen complaints, time losses, sick time usage, reassignments, promotions, and separations. A causal-comparative research design was used to determine if there were significant statistical differences in these job performance variables between police officers with "acceptable" psychological ratings and police officers with "marginal" psychological ratings. The results of multivariate analyses of variance, t-tests, and chi-square procedures, as applicable, showed no significant differences between the two groups on any of the job performance variables.
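One of the comparisons described above can be illustrated with a minimal sketch of a Welch two-sample t-test on a single job-performance variable. The scores below are invented for illustration, not the study's data, and the study itself used MANOVA and chi-square procedures alongside t-tests.

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples,
    allowing unequal variances and unequal group sizes."""
    ma, mb = statistics.mean(sample_a), statistics.mean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se = math.sqrt(va / len(sample_a) + vb / len(sample_b))
    return (ma - mb) / se

# Hypothetical final academy grade averages for the two rating groups.
acceptable = [88.0, 91.5, 85.0, 90.0, 87.5]
marginal = [86.0, 89.0, 84.5, 90.5, 88.0]
t = welch_t(acceptable, marginal)
# A |t| well below ~2 is consistent with "no significant difference"
# at conventional significance levels.
```

A finding of "no significant differences" across all variables, as in the study, corresponds to every such statistic falling short of its critical value.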
Abstract:
The Ice Station POLarstern (ISPOL) cruise revisited the western Weddell Sea in late 2004 and obtained a comprehensive set of conductivity-temperature-depth (CTD) data. This study describes the thermohaline structure and diapycnal mixing environment observed in 2004 and compares them with conditions observed more than a decade earlier. Hydrographic conditions on the central western Weddell Sea continental slope, off Larsen C Ice Shelf, in late winter/early spring of 2004/2005 can be described as a well-stratified environment with upper layers evidencing relict structures from intense winter near-surface vertical fluxes, an intermediate depth temperature maximum, and a cold near-bottom layer marked by patchy property distributions. A well-developed surface mixed layer, isolated from the underlying Warm Deep Water (WDW) by a pronounced pycnocline and characterized by lack of warming and by minimal sea-ice basal melting, supports the assumption that upper ocean winter conditions persisted during most of the ISPOL experiment. Much of the western Weddell Sea water column has remained essentially unchanged since 1992; however, significant differences were observed in two of the regional water masses. The first, Modified Weddell Deep Water (MWDW), comprises the permanent pycnocline and was less saline than a decade earlier, whereas Weddell Sea Bottom Water (WSBW) was horizontally patchier and colder. Near-bottom temperatures observed in 2004 were the coldest on record for the western Weddell Sea over the continental slope. Minimum temperatures were ~0.4 and ~0.3 °C colder than during 1992-1993, respectively. The 2004 near-bottom temperature/salinity characteristics revealed the presence of two different WSBW types, whereby a warm, fresh layer overlies a colder, saltier layer (both formed in the western Weddell Sea).
The deeper layer may have formed locally as high salinity shelf water (HSSW) that flowed intermittently down the continental slope, which is consistent with the observed horizontal patchiness. The latter can be associated with the near-bottom variability found in Powell Basin with consequences for the deep water outflow from the Weddell Sea.
Abstract:
The observation chart is for many health professionals (HPs) the primary source of objective information relating to the health of a patient. Information Systems (IS) research has demonstrated the positive impact of good interface design on decision making, and it is logical that good observation chart design can positively impact healthcare decision making. Despite this potential, there is a paucity of observation chart design literature, with the primary sources leveraging Human Computer Interaction (HCI) literature to design better charts. While this approach has been successful, it introduces a gap between understanding of the tasks performed by HPs when using charts and the design features implemented in the chart. Good IS allow for the collection and manipulation of data so that it can be presented in a timely manner that supports specific tasks. Good interface design should therefore consider the specific tasks being performed prior to designing the interface. This research adopts a Design Science Research (DSR) approach to formalise a framework of design principles that incorporates knowledge of the tasks performed by HPs when using observation charts and knowledge pertaining to visual representations of data and semiology of graphics. This research is presented in three phases; the initial two phases seek to discover and formalise design knowledge embedded in two situated observation charts: the paper-based NEWS chart developed by the Health Service Executive in Ireland and the electronically generated eNEWS chart developed by the Health Information Systems Research Centre in University College Cork. A comparative evaluation of each chart is also presented in the respective phases.
Throughout each of these phases, tentative versions of a design framework for electronic vital sign observation charts are presented, with each subsequent iteration of the framework (versions Alpha, Beta, V0.1 and V1.0) representing a refinement of the design knowledge. The design framework is named the framework for the Retrospective Evaluation of Vital Sign Information from Early Warning Systems (REVIEWS). Phase 3 of the research presents the deductive process for designing and implementing V0.1 of the framework, with evaluation of the instantiation allowing for the final iteration, V1.0, of the framework. This study makes a number of contributions to academic research. First, the research demonstrates that the cognitive tasks performed by nurses during clinical reasoning can be supported through good observation chart design. Second, the research establishes the utility of electronic vital sign observation charts in terms of supporting the cognitive tasks performed by nurses during clinical reasoning. Third, the REVIEWS framework represents a comprehensive set of design principles which, if applied to chart design, will improve the usefulness of the chart in terms of supporting clinical reasoning. Fourth, the electronic observation chart that emerges from this research is demonstrated to be significantly more useful than previously designed charts and represents a significant contribution to practice. Finally, the research presents a research design that employs a combination of inductive and deductive design activities to iterate on the design of situated artefacts.
Abstract:
Diabetes is the leading cause of end stage renal disease. Despite evidence for a substantial heritability of diabetic kidney disease, efforts to identify genetic susceptibility variants have had limited success. We extended previous efforts in three dimensions, examining a more comprehensive set of genetic variants in larger numbers of subjects with type 1 diabetes characterized for a wider range of cross-sectional diabetic kidney disease phenotypes. In 2,843 subjects, we estimated that the heritability of diabetic kidney disease was 35% (p=6×10⁻³). Genome-wide association analysis and replication in 12,540 individuals identified no single variants reaching stringent levels of significance and, despite excellent power, provided little independent confirmation of previously published associated variants. Whole exome sequencing in 997 subjects failed to identify any large-effect coding alleles of lower frequency influencing the risk of diabetic kidney disease. However, sets of alleles increasing body mass index (p=2.2×10⁻⁵) and the risk of type 2 diabetes (p=6.1×10⁻⁴) were associated with the risk of diabetic kidney disease. We also found genome-wide genetic correlation between diabetic kidney disease and failure at smoking cessation (p=1.1×10⁻⁴). Pathway analysis implicated ascorbate and aldarate metabolism (p=9×10⁻⁶), and pentose and glucuronate interconversions (p=3×10⁻⁶) in pathogenesis of diabetic kidney disease. These data provide further evidence for the role of genetic factors influencing diabetic kidney disease in those with type 1 diabetes and highlight some key pathways that may be responsible. Altogether, these results reveal important biology behind the major cause of kidney disease.
Abstract:
A decree of the Star Chamber designed to regulate the printing of all literary works, whether ecclesiastical or secular in nature. The decree further entrenched the significance and validity of ‘stationers' copyright' in requiring that no work be printed without first being entered on the Company of Stationers' Register Book. The decree also provided that any materials printed thereafter were to carry both the name of the printer and the author of the work.
The commentary describes how, by comparison with earlier decrees (see: uk_1566; uk_1586), the 1637 Decree provided a more elaborate system for licensing both ecclesiastical and secular works as well as a more comprehensive set of regulations to govern the operation of the printing trade. As a regulatory measure, it is widely regarded as representing the high point of the Company of Stationers' control and authority over the book trade.
Abstract:
Legislation prohibiting the publication of any literary work without prior licence.
Drawing upon both the Star Chamber Decree 1637 (uk_1637) and the Acts Regulating Printing during the Interregnum (see: uk_1643 and associated documents), the Licensing Act set out a comprehensive set of provisions concerning both the licensing of the press and the regulation and management of the book trade. In addition, it confirmed the rights of those holding printing privileges (or patents) granted in accordance with the royal prerogative (see for example: Day's privilege for The Cosmographical Glass (uk_1559b)) as well as those who had registered works with the Stationers' Company (uk_1557). It also introduced the first legal library deposit requirement. In force between 1662 and 1679, and then again between 1685 and 1695, the Act represents the last occasion on which the censorship of the press was formally and strategically linked to the protection of the economic interests of the Stationers' Company. Its lapse led the Stationers' Company to lobby parliament for renewed protection, ultimately resulting in the passing of the Statute of Anne 1710 (uk_1710).
Abstract:
End users urgently request using mobile devices at their workplace. They know these devices from their private life, appreciate their functionality and usability, and want to benefit from these advantages at work as well; they will not accept limitations and restrictions. At the same time, companies are obliged to employ substantial organizational and technical measures to ensure data security and compliance when allowing the use of mobile devices in the workplace. So far, only individual arrangements have been presented, each addressing single issues in ensuring data security and compliance. However, companies need to follow a comprehensive set of measures addressing all relevant aspects of data security and compliance in order to play it safe. Thus, in this paper, technical architectures for using mobile devices in enterprise IT are first reviewed. Thereafter a set of compliance rules is presented and, as the major contribution, technical measures are explained that enable a company to integrate mobile devices into enterprise IT while still complying with these rules comprehensively. Depending on the company context, one or more of the technical architectures have to be chosen, impacting the specific technical measures for compliance as elaborated in this paper. Altogether this paper, for the first time, correlates technical architectures for using mobile devices at the workplace with technical measures to assure data security and compliance according to a comprehensive set of rules.