11 results for Science projects
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Over the last decade, translational science has come into the focus of academic medicine, and significant intellectual and financial efforts have been made to initiate a multitude of bench-to-bedside projects. The quest for suitable biomarkers that will significantly change clinical practice has become one of the biggest challenges in translational medicine. Quantitative measurement of proteins is a critical step in biomarker discovery. Assessing a large number of potential protein biomarkers across a statistically meaningful number of samples and controls still constitutes a major technical hurdle. Multiplexed analysis offers significant advantages in terms of time, reagent cost, sample requirements and the amount of data that can be generated. The two contemporary approaches to multiplexed, quantitative biomarker validation, antibody-based immunoassays and MS-based multiple (or selected) reaction monitoring, are based on different assay principles and instrument requirements. Both approaches have their own advantages and disadvantages and therefore play complementary roles in the multi-staged biomarker verification and validation process. In this review, we discuss quantitative immunoassay and multiple reaction monitoring/selected reaction monitoring assay principles and development. We also discuss how to choose an appropriate platform, judge assay performance, and obtain reliable, quantitative results for translational research and clinical applications in the biomarker field.
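As a concrete illustration of the quantitative-immunoassay side of this comparison, the sketch below fits a four-parameter logistic (4PL) calibration curve, a standard model for immunoassay quantification, and back-calculates the concentration of an unknown sample. It is not taken from the review itself; the calibrator concentrations and signal values are invented for illustration.

```python
# Hypothetical sketch: fitting a four-parameter logistic (4PL) calibration
# curve, a model commonly used to quantify analytes in immunoassays.
# All concentrations and signal values below are invented.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL model: a = response at zero concentration, d = maximal response,
    c = inflection point (EC50), b = Hill slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Invented calibrator series (pg/mL) and measured signals
conc = np.array([1, 5, 25, 125, 625, 3125], dtype=float)
signal = np.array([0.05, 0.18, 0.61, 1.58, 2.43, 2.71])

params, _ = curve_fit(four_pl, conc, signal,
                      p0=[0.05, 1.0, 100.0, 3.0], maxfev=10000)

def back_calculate(y, a, b, c, d):
    """Invert the 4PL fit to estimate the concentration of an unknown."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Estimate the concentration of an unknown sample from its measured signal
print("Estimated concentration (pg/mL):", back_calculate(1.0, *params))
```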
Abstract:
BACKGROUND: Only published study results are available to the scientific community for further use, such as informing future research and synthesising the available evidence. If study results are reported selectively, reporting bias can occur and summarised estimates of the effect or harm of treatments can be distorted. We studied the publication and citation of results of clinical research conducted in Germany. METHODS: The protocols of clinical research projects submitted to the research ethics committee of the University of Freiburg (Germany) in 2000 were analysed. Several databases were searched for published full articles, and investigators were contacted. Data on study and publication characteristics were extracted from the protocols and corresponding publications. RESULTS: 299 study protocols were included. The most frequent study design was the randomised controlled trial (141; 47%), followed by uncontrolled studies (61; 20%), laboratory studies (30; 10%) and non-randomised studies (29; 10%). 182 (61%) were multicentre studies, including 97 (53%) international collaborations. 152 of 299 (51%) had commercial (co-)funding and 46 (15%) non-commercial funding. 109 of the 225 completed protocols corresponded to at least one full publication (210 articles in total), a publication rate of 48%. 168 of the 210 identified publications (80%) were cited in articles indexed in the ISI Web of Science, with a median of 11 citations per publication (range 0-1151). CONCLUSIONS: Results of clinical research projects conducted in Germany are largely underreported. Barriers to successful publication need to be identified and appropriate measures taken. Close monitoring of projects until publication and adequate support for investigators may help remedy the prevailing underreporting of research.
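A minimal sketch of the arithmetic behind the rates reported above, using the counts given in the abstract:

```python
# Reproduce the summary statistics reported in the abstract.
completed_protocols = 225
published_protocols = 109   # protocols with at least one full publication
publications = 210
cited_publications = 168    # cited in articles indexed in ISI Web of Science

publication_rate = published_protocols / completed_protocols
citation_share = cited_publications / publications

print(f"Publication rate: {publication_rate:.0%}")   # -> 48%
print(f"Cited publications: {citation_share:.0%}")   # -> 80%
```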
Abstract:
Gaining economic benefits from substantially lower labor costs has been reported as a major reason for offshoring labor-intensive information systems services to low-wage countries. However, if wage differences are so high, why is there such a high level of variation in the economic success of offshored IS projects? This study argues that offshore outsourcing involves a number of extra costs for the client organization that account for the economic failure of offshore projects. The objective is to disaggregate these extra costs into their constituent parts and to explain why they differ between offshored software projects. The focus is on software development and maintenance projects that are offshored to Indian vendors. A theoretical framework is developed a priori based on transaction cost economics (TCE) and the knowledge-based view of the firm, complemented by factors that acknowledge the specific offshore context. The framework is empirically explored using a multiple case study design including six offshored software projects in a large German financial service institution. The results of our analysis indicate that the client incurs post-contractual extra costs for four types of activities: (1) requirements specification and design, (2) knowledge transfer, (3) control, and (4) coordination. In projects that require a high level of client-specific knowledge about idiosyncratic business processes and software systems, these extra costs were found to be substantially higher than in projects where more general knowledge was needed. Notably, these costs most often arose independently of the threat of opportunistic behavior, challenging the predominant TCE logic of market failure. Rather, client extra costs were particularly high in client-specific projects because the effort of managing the consequences of the knowledge asymmetries between client and vendor was particularly high in these projects. Prior experience of the vendor with related client projects was found to reduce the level of extra costs but could not fully offset the increase in extra costs in highly client-specific projects. Moreover, cultural and geographic distance between client and vendor, as well as personnel turnover, were found to increase client extra costs. Some evidence was found, however, that the cost-increasing impact of these factors was amplified in projects with a high level of required client-specific knowledge (a moderator effect).
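The moderator effect mentioned at the end can be illustrated with a simple linear model containing an interaction term. The sketch below is not the authors' analysis (the study is a qualitative multiple case study); it merely shows, with invented data, the general form such a moderation relationship takes.

```python
# Illustrative sketch (not the authors' model): a linear specification with an
# interaction term expresses the moderator effect described above, where
# cultural/geographic distance raises client extra costs more strongly in
# projects requiring highly client-specific knowledge. All data are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 200
distance = rng.uniform(0, 1, n)        # cultural/geographic distance
specificity = rng.uniform(0, 1, n)     # required client-specific knowledge
extra_cost = (0.5 + 1.0 * distance + 2.0 * specificity
              + 1.5 * distance * specificity   # moderator (interaction) term
              + rng.normal(0, 0.1, n))         # noise

# Ordinary least squares with an explicit interaction column
X = np.column_stack([np.ones(n), distance, specificity,
                     distance * specificity])
coef, *_ = np.linalg.lstsq(X, extra_cost, rcond=None)
print("intercept, distance, specificity, interaction:", coef.round(2))
```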
Abstract:
Knowledge processes are critical to outsourced software projects. According to outsourcing research, outsourced software projects succeed if they manage to integrate the client’s business knowledge and the vendor’s technical knowledge. In this paper, we submit that this view, while not wrong, is incomplete for a significant part of outsourced software work: software maintenance. Data from six software-maintenance outsourcing transitions indicate that application knowledge, which vendor engineers acquire over time through practice, can be more important than business or technical knowledge. Application knowledge dominated knowledge transfer activities, and its acquisition enabled vendor staff to solve maintenance tasks. We discuss implications for widespread assumptions in outsourcing research.
Abstract:
Given the centrality of control for achieving success in outsourced software projects, past research has identified key exogenous factors that determine the choice of controls. This view of exogenously driven control choice rests on a number of assumptions; in particular, clients and vendors are seen as separate cognitive entities that combat opportunistic threats under environmental uncertainty through one-off choices or infrequent revisions of controls. In this paper we complement this perspective by acknowledging that an outsourced software project may be characterized as a collective, evolving process that copes with the cognitive limitations of both client and vendor through a continuous process of learning. We argue that, viewed in this way, controls are less the subject of deliberate choice than of endogenously driven change; that is, controls evolve in close interaction with the evolving software project. Accordingly, we suggest a complementary model of endogenous control, in which controls mediate individual and collective learning processes. Our research contributes to a better understanding of the dynamics in outsourced software projects. It also spells out methodological implications that may help improve cross-sectional control research.
Abstract:
Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. The special nature of Data Warehousing benefits and the large share of infrastructure-related activities are cited as reasons for applying inappropriate methods, performing incomplete evaluations, or omitting justification entirely. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results are presented from a large academia-industry collaboration project on non-technical issues of Data Warehousing. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed, and the specific properties of Data Warehousing projects are discussed. Based on an applicability analysis of traditional approaches to economic IT project justification, basic steps and responsibilities for the justification of Data Warehousing projects are derived.
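Among the traditional approaches to economic IT project justification that such an applicability analysis would assess, net present value (NPV) is a standard example. The sketch below is a minimal, hypothetical illustration, not drawn from the paper; the cash flows and discount rate are invented.

```python
# Minimal NPV sketch for a hypothetical Data Warehousing project:
# a large year-0 infrastructure investment followed by estimated net benefits.
def npv(rate, cash_flows):
    """Discount a series of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: infrastructure investment; years 1-4: estimated net benefits (invented)
flows = [-1_200_000, 250_000, 400_000, 450_000, 500_000]
print(f"NPV at 8%: {npv(0.08, flows):,.0f}")  # positive NPV would support the project
```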
Abstract:
Sedimentary sequences in ancient or long-lived lakes can reach several thousands of meters in thickness and often provide an unrivalled perspective on the lake's regional climatic, environmental, and biological history. Over the last few years, deep-drilling projects in ancient lakes have become increasingly multi- and interdisciplinary, as seismological, sedimentological, biogeochemical, climatic, environmental, paleontological, and evolutionary information, among others, can be obtained from sediment cores. However, these multi- and interdisciplinary projects pose several challenges. The scientists involved typically approach problems from different scientific perspectives and backgrounds, and setting up the program requires clear communication and the alignment of interests. One of the most challenging tasks, besides the actual drilling operation, is to link diverse datasets with varying resolution, data quality, and age uncertainties in order to answer interdisciplinary questions synthetically and coherently. These problems are especially relevant when secondary data, i.e., datasets obtained independently of the drilling operation, are incorporated into analyses. Nonetheless, the inclusion of secondary information, such as isotopic data from fossils found in outcrops or genetic data from extant species, may help to achieve synthetic answers. Recent technological and methodological advances in paleolimnology are likely to increase the possibilities for integrating secondary information. Some of the new approaches have started to revolutionize scientific drilling in ancient lakes, but at the same time they also add a new layer of complexity to the generation and analysis of sediment-core data. The enhanced opportunities presented by new scientific approaches to studying the paleolimnological history of these lakes therefore come at the expense of greater logistic, communication, and analytical effort. Here we review the types of data that can be obtained in ancient lake drilling projects and the analytical approaches that can be applied to empirically and statistically link diverse datasets, creating an integrative perspective on geological and biological data. In doing so, we highlight the strengths and potential weaknesses of new methods and analyses, and provide recommendations for future interdisciplinary deep-drilling projects.
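One concrete example of linking datasets with varying resolution is resampling proxy records onto a shared age model. The sketch below illustrates this step with invented depths, ages, and proxy values; it is not drawn from any specific drilling project.

```python
# Hedged sketch: assign ages to irregularly sampled proxy data via a core age
# model (depth -> age), then resample onto a common age grid so that records
# of different resolution can be compared. All numbers are invented.
import numpy as np

# Core age model tiepoints: depth (m) -> age (kyr before present)
depth_tiepoints = np.array([0.0, 10.0, 50.0, 120.0])
age_tiepoints = np.array([0.0, 8.0, 45.0, 130.0])

# A proxy record sampled at irregular depths
proxy_depth = np.array([2.0, 7.5, 20.0, 66.0, 110.0])
proxy_value = np.array([1.2, 1.5, 0.9, 1.8, 1.1])

# Interpolate ages for the proxy samples, then resample onto a 5 kyr grid
proxy_age = np.interp(proxy_depth, depth_tiepoints, age_tiepoints)
common_age = np.arange(0.0, 130.0, 5.0)
proxy_on_grid = np.interp(common_age, proxy_age, proxy_value)
print(proxy_on_grid[:5])  # proxy values on the shared age scale
```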