50 results for Election Counting and Reporting Software
Abstract:
Several authors have demonstrated an increased number of mitotic figures in breast cancer resection specimens compared with biopsy material. This has been ascribed to a sampling artifact whereby biopsies are either (i) too small to allow formal mitotic figure counting or (ii) not necessarily taken from the proliferating tumor periphery. Herein, we propose a different explanation for this phenomenon. Biopsy and resection material from 52 invasive ductal carcinomas was studied. We counted mitotic figures in 10 representative high-power fields and quantified MIB-1 immunohistochemistry by visual estimation, counting, and image analysis. We found that mitotic figures were elevated more than three-fold on average in resection specimens over biopsy material from the same tumors (20±6 vs 6±2 mitoses per 10 high-power fields, P=0.008), and that this was accompanied by a relative diminution of post-metaphase figures (anaphase/telophase), which made up 7% of all mitotic figures in biopsies but only 3% in resection specimens (P<0.005). At the same time, the percentages of MIB-1-immunostained tumor cells among total tumor cells were comparable in biopsy and resection material, irrespective of the mode of MIB-1 quantification. Finally, we found no association between the size of the biopsy material and the relative increase of mitotic figures in resection specimens. We propose that the increase in mitotic figures in resection specimens and the significant shift towards metaphase figures are not due to a sampling artifact but reflect ongoing cell cycle activity in the resected tumor tissue due to fixation delay. The dwindling energy supply will eventually arrest tumor cells in metaphase, where they are readily identified by the diagnostic pathologist. Taken together, we suggest that the rapidly fixed biopsy material better represents true tumor biology and should be privileged as a predictive marker of putative response to cytotoxic chemotherapy.
Abstract:
PURPOSE: To evaluate the feasibility of radioimmunotherapy (RIT) with radiolabeled anti-carcinoembryonic antigen antibodies after complete resection of liver metastases (LM) from colorectal cancer. PATIENTS AND METHODS: Twenty-two patients planned for surgery of one to four LM received a preoperative diagnostic dose of a 131I-F(ab')2-labeled anti-carcinoembryonic antigen monoclonal antibody, F6 (8-10 mCi/5 mg). 131I-F(ab')2 uptake was analyzed using direct radioactivity counting, and tumor-to-normal liver ratios were recorded. Ten patients with tumor-to-normal liver ratios >5, and three others, were treated with a therapeutic injection [180-200 mCi 131I/50 mg F(ab')2] 30 to 64 days after surgery. RESULTS: Median 131I-F(ab')2 immunoreactivity in patient serum remained at 91% of initial values for up to 96 hours after injection. The main and dose-limiting toxicity was hematologic, with grade 3 to 4 neutropenia and thrombocytopenia in 92% and 85% of patients, respectively. Complete spontaneous recovery occurred in all patients. No human anti-mouse antibody response was observed after the diagnostic dose; however, 10 of the 13 treated patients developed human anti-mouse antibodies approximately 3 months later. Two treated patients presented extrahepatic metastases at the time of RIT (one bone and one abdominal node), and two relapsed within 3 months of RIT (one in the lung and the other in the liver). Two patients are still alive, and one of these is disease-free at 93 months after resection. At a median follow-up of 127 months, the median disease-free survival is 12 months and the median overall survival is 50 months. CONCLUSION: RIT is feasible in an adjuvant setting after complete resection of LM from colorectal cancer and should be considered for future trials, possibly in combination with chemotherapy, because of the generally poor prognosis of these patients.
Abstract:
PURPOSE: The aim of this study was to test whether oligonucleotide-targeted gene repair can correct the point mutation in genomic DNA of PDE6b(rd1) (rd1) mouse retinas in vivo. METHODS: Oligonucleotides (ODNs) 25 nucleotides in length, complementary to the genomic sequence subsuming the rd1 point mutation in the gene encoding the beta-subunit of rod photoreceptor cGMP-phosphodiesterase (beta-PDE), were synthesized with a wild-type nucleotide base at the rd1 point mutation position. Control ODNs contained the same nucleotide bases as the wild-type ODNs but with varying degrees of sequence mismatch. We previously developed a repeatable and relatively non-invasive technique to enhance ODN delivery to photoreceptor nuclei using transpalpebral iontophoresis prior to intravitreal ODN injection. Three such treatments were performed on C3H/henJ (rd1) mouse pups before postnatal day (PN) 9. Treatment outcomes were evaluated at PN28 or PN33, when retinal degeneration was nearly complete in untreated rd1 mice. The effect of treatment on photoreceptor survival was evaluated by counting the number of photoreceptor cell nuclei and by assessing rhodopsin immunohistochemistry on flat-mount retinas and sections. Gene repair in the retina was quantified by allele-specific real-time PCR and by detection of beta-PDE-immunoreactive photoreceptors. Confirmatory experiments were conducted using independent rd1 colonies in separate laboratories. These experiments included an additional negative control ODN that contained the rd1 mutant nucleotide base at the rd1 point mutation site, such that the sole difference between treatment with wild-type and control ODN was the single base at the rd1 point mutation site. RESULTS: Iontophoresis enhanced the penetration of intravitreally injected ODNs in all retinal layers.
Using this delivery technique, significant survival of photoreceptors was observed in retinas from eyes treated with wild type ODNs but not control ODNs as demonstrated by cell counting and rhodopsin immunoreactivity at PN28. Beta-PDE immunoreactivity was present in retinas from eyes treated with wild type ODN but not from those treated with control ODNs. Gene correction demonstrated by allele-specific real time PCR and by counts of beta-PDE-immunoreactive cells was estimated at 0.2%. Independent confirmatory experiments showed that retinas from eyes treated with wild type ODN contained many more rhodopsin immunoreactive cells compared to retinas treated with control (rd1 sequence) ODN, even when harvested at PN33. CONCLUSIONS: Short ODNs can be delivered with repeatable efficiency to mouse photoreceptor cells in vivo using a combination of intravitreal injection and iontophoresis. Delivery of therapeutic ODNs to rd1 mouse eyes resulted in genomic DNA conversion from mutant to wild type sequence, low but observable beta-PDE immunoreactivity, and preservation of rhodopsin immunopositive cells in the outer nuclear layer, suggesting that ODN-directed gene repair occurred and preserved rod photoreceptor cells. Effects were not seen in eyes treated with buffer or with ODNs having the rd1 mutant sequence, a definitive control for this therapeutic approach. Importantly, critical experiments were confirmed in two laboratories by several different researchers using independent mouse colonies and ODN preparations from separate sources. These findings suggest that targeted gene repair can be achieved in the retina following enhanced ODN delivery.
Abstract:
A solution of ¹⁸F was standardised with a 4πβ-4πγ coincidence counting system in which the beta detector is a one-inch diameter cylindrical UPS89 plastic scintillator positioned at the bottom of a well-type 5″ × 5″ NaI(Tl) gamma-ray detector. Almost full detection efficiency, which was varied downwards electronically, was achieved in the beta channel. Aliquots of this ¹⁸F solution were also measured using 4πγ NaI(Tl) integral counting with Monte Carlo calculated efficiencies, as well as the CIEMAT-NIST method. Secondary measurements of the same solution were also performed with an IG11 ionisation chamber whose equivalent activity is traceable to the Système International de Référence through the contribution IRA-METAS made to it in 2001; IRA's degree of equivalence was found to be close to the key comparison reference value (KCRV). The ¹⁸F activity predicted by this coincidence system agrees closely with the ionisation chamber measurement and is compatible within one standard deviation with the other primary measurements. This work demonstrates that our new coincidence system can standardise short-lived radionuclides used in nuclear medicine.
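The principle behind the coincidence method described above can be illustrated with a minimal sketch: for a source of activity A whose beta and gamma emissions are detected with efficiencies εβ and εγ, the ideal count rates are Nβ = A·εβ, Nγ = A·εγ and Nc = A·εβ·εγ, so A = Nβ·Nγ/Nc is recovered independently of either efficiency. The count rates below are purely illustrative assumptions, not values from the study:

```python
# Illustrative sketch of the beta-gamma coincidence counting principle.
# With beta efficiency eps_b and gamma efficiency eps_g, the ideal rates are
#   N_beta = A * eps_b,  N_gamma = A * eps_g,  N_coinc = A * eps_b * eps_g,
# so A = N_beta * N_gamma / N_coinc, independent of both efficiencies.

def coincidence_activity(n_beta, n_gamma, n_coinc):
    """Estimate source activity (Bq) from background-corrected count rates."""
    return n_beta * n_gamma / n_coinc

# Hypothetical rates (counts/s), consistent with A = 10 kBq,
# eps_b = 0.95 and eps_g = 0.60:
activity = coincidence_activity(n_beta=9500.0, n_gamma=6000.0, n_coinc=5700.0)
print(activity)  # 10000.0 -> the recovered activity in Bq
```

In practice the beta efficiency is varied (as in the electronically varied efficiency mentioned above) and the apparent activity is extrapolated to εβ → 1; the sketch shows only the core identity.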
Abstract:
This study describes major electrocardiogram (ECG) measurements and diagnoses in a population of African individuals; most reference data have been collected in Caucasian populations, and evidence exists for interethnic differences in ECG findings. The study was conducted in the Seychelles islands (Indian Ocean) and included 709 black individuals (343 men and 366 women) aged 25 to 64 years, randomly selected from the general population. Resting ECGs were recorded using a validated ECG unit equipped with measurement and interpretation software (Cardiovit AT-6, Schiller, Switzerland). The epidemiology of 14 basic ECG measurements, 6 composite criteria for left ventricular hypertrophy, and 19 specific ECG diagnoses, including abnormal rhythms, conduction abnormalities, repolarization abnormalities, and myocardial infarction, was examined. Substantial gender and age differences were found for several ECG parameters. Moreover, tracings recorded in African individuals of the Seychelles differed in many respects from those collected similarly in Caucasian populations. For instance, heart rate was approximately 5 beats per minute lower in the African individuals than in selected Caucasian populations, the prevalence of first-degree atrioventricular block was especially high (4.8%), and the average Sokolow-Lyon voltage was markedly higher in African individuals of the Seychelles than in black and white Americans. The integrated interpretation software detected "old myocardial infarction" in 3.8% of men and 0% of women, and "old myocardial infarction possible" in 6.1% and 3%, respectively. Cardiac infarction injury scores are also provided. In conclusion, the study provides reference values for ECG findings in a specific population of people of African descent and stresses the need to systematically consider gender, age, and ethnicity when interpreting ECG tracings.
Abstract:
Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is being produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are battling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners who develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models to establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis aims to address this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to find out what determines the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, grounded in an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes gaps in the research related to PaaS business model design and, more generally, to platform business models.
Abstract:
ExPASy (http://www.expasy.org) has a worldwide reputation as one of the main bioinformatics resources for proteomics. It has now evolved into an extensible and integrative portal that provides access to many scientific resources, databases, and software tools in different areas of the life sciences. Scientists can henceforth seamlessly access a wide range of resources in many different domains, such as proteomics, genomics, phylogeny/evolution, systems biology, population genetics, and transcriptomics. The individual resources (databases, web-based and downloadable software tools) are hosted in a 'decentralized' way by different groups of the SIB Swiss Institute of Bioinformatics and partner institutions. Specifically, a single web portal provides a common entry point to a wide range of resources developed and operated by different SIB groups and external institutions. The portal features a search function across 'selected' resources. Additionally, the availability and usage of resources are monitored. The portal is aimed at both expert users and people who are not familiar with a specific domain in the life sciences. The new web interface provides, in particular, visual guidance for newcomers to ExPASy.
Abstract:
Ambulatory blood pressure monitoring (ABPM) is being used increasingly in both clinical practice and hypertension research. Although there are many guidelines that emphasize the indications for ABPM, there is no comprehensive guideline dealing with all aspects of the technique. It was agreed at a consensus meeting on ABPM in Milan in 2011 that the 34 attendees should prepare a comprehensive position paper on the scientific evidence for ABPM. This position paper considers the historical background, the advantages and limitations of ABPM, the threshold levels for practice, and the cost-effectiveness of the technique. It examines the need for selecting an appropriate device, the accuracy of devices, the additional information and indices that ABPM devices may provide, and the software requirements. At a practical level, the paper details the requirements for using ABPM in clinical practice, editing considerations, the number of measurements required, and the circumstances, such as obesity and arrhythmias, in which particular care needs to be taken when using ABPM. The clinical indications for ABPM, among which white-coat phenomena, masked hypertension, and nocturnal hypertension appear to be prominent, are outlined in detail, along with special considerations that apply in certain clinical circumstances, such as childhood, the elderly, pregnancy, and cardiovascular illness, examples being stroke and chronic renal disease; the place of home measurement of blood pressure in relation to ABPM is also appraised. Finally, the role of ABPM in research settings, such as pharmacological trials and the prediction of outcome in epidemiological studies, is examined, and the implementation of ABPM in practice is considered in relation to reimbursement in different countries, the provision of the technique by primary care practices, hospital clinics, and pharmacies, and the growing role of ABPM registries in many countries.
Abstract:
The authors developed a free-breathing black-blood coronary magnetic resonance (MR) angiographic technique with a potential for exclusive visualization of the coronary blood pool. Results with the MR angiographic technique were evaluated in eight healthy subjects and four patients with coronary disease identified at conventional angiography. This MR angiographic technique accurately depicted luminal disease in the patients and permitted visualization of extensive continuous segments of the native coronary tree in both the healthy subjects and the patients. Black-blood coronary MR angiography provides an alternative source of contrast enhancement.
Abstract:
OBJECTIVE: This study sought to determine the prevalence of transactional sex among university students in Uganda and to assess the possible relationship between transactional sex and sexual coercion, physical violence, mental health, and alcohol use. METHODS: In 2010, 1954 undergraduate students at a Ugandan university responded to a self-administered questionnaire that assessed mental health, substance use, physical violence, and sexual behaviors, including sexual coercion and transactional sex. The prevalence of transactional sex was assessed, and logistic regression analysis was performed to measure the associations between various risk factors and reporting transactional sex. RESULTS: Approximately 25% of the study sample reported having taken part in transactional sex, with more women reporting having accepted money, gifts, or some compensation for sex, and more men reporting having paid, given a gift, or otherwise compensated for sex. Sexual coercion in men and women was significantly associated with having accepted money, gifts, or some compensation for sex. Men who were victims of physical violence in the last 12 months had a higher probability of having accepted money, gifts, or some compensation for sex than other men. Women who were victims of sexual coercion reported a greater likelihood of having paid, given a gift, or otherwise compensated for sex. Respondents who had been victims of physical violence in the last 12 months, engaged in heavy episodic drinking, or had poor mental health status were more likely to have paid, given a gift, or otherwise compensated for sex. CONCLUSIONS: University students in Uganda are at high risk of transactional sex. Young men and women may be equally vulnerable to the risks and consequences of transactional sex and should be included in program initiatives to prevent transactional sex.
The role of sexual coercion, physical violence, mental health, and alcohol use should be considered when designing interventions for countering transactional sex.
Abstract:
In Neo-Darwinism, variation and natural selection are the two evolutionary mechanisms that propel biological evolution. Our previous reports presented a histogram model to simulate the evolution of populations of individuals classified into bins according to an unspecified, quantifiable phenotypic character, whose number in each bin changed generation after generation under the influence of fitness while the total population was kept constant. The histogram model also allowed Shannon entropy (SE) to be monitored continuously as the information content of the total population decreased or increased. Here, a simple Perl (Practical Extraction and Reporting Language) application was developed to carry out these computations, with the critical feature of an added random factor in the percentage of individuals whose offspring moved to a vicinal bin. The results of the simulations demonstrate that the random factor mimicking variation considerably increased the range of values covered by Shannon entropy, especially when the percentage of changed offspring was high. This increase in information content is interpreted as facilitated adaptability of the population.
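The histogram model described above can be sketched in a few lines. The original application was written in Perl; the following is a minimal Python re-sketch under stated assumptions: the bin counts, per-bin fitness values, and the size of the random perturbation are illustrative, and offspring are moved to the right-hand neighbour (falling back to the left at the boundary), which is one simple reading of "vicinal bin".

```python
import math
import random

def shannon_entropy(pop):
    """Shannon entropy (bits) of the population distribution over bins."""
    total = sum(pop)
    return -sum((n / total) * math.log2(n / total) for n in pop if n > 0)

def next_generation(pop, fitness, move_frac, noise, rng):
    """One generation: fitness-weighted reproduction, then a randomly
    perturbed fraction of each bin's offspring moves to a vicinal bin;
    the total is rescaled so the population size stays constant."""
    size = sum(pop)
    new = [n * f for n, f in zip(pop, fitness)]  # fitness-weighted reproduction
    moved = [0.0] * len(new)
    for i, n in enumerate(new):
        # random factor mimicking variation in the percent of moved offspring
        frac = max(0.0, move_frac + rng.uniform(-noise, noise))
        m = n * frac
        j = i + 1 if i + 1 < len(new) else i - 1  # move right, else left
        moved[i] -= m
        moved[j] += m
    new = [n + d for n, d in zip(new, moved)]
    scale = size / sum(new)  # keep total population constant
    return [n * scale for n in new]

rng = random.Random(42)
pop = [250.0, 250.0, 250.0, 250.0]  # four phenotype bins, 1000 individuals
fitness = [0.9, 1.0, 1.1, 1.2]      # hypothetical per-bin fitness
for _ in range(50):
    pop = next_generation(pop, fitness, move_frac=0.1, noise=0.05, rng=rng)
print(round(shannon_entropy(pop), 3))  # entropy falls as selection concentrates the population
```

Starting from a uniform distribution (SE = 2 bits over four bins), selection concentrates the population in the fittest bins and the entropy declines; increasing `noise` widens the range of entropy values seen across runs, which is the effect the abstract reports.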
Abstract:
Since the development of the first whole-cell living biosensor, or bioreporter, about 15 years ago, the construction and testing of new genetically modified microorganisms for environmental sensing and reporting has proceeded at an ever-increasing rate. A decade and a half appears to be a reasonable time span for a new technology to reach the maturity needed for application and commercial success. It seems, however, that research into cellular biosensors is still mostly in a proof-of-principle or demonstration phase and not close to extensive or commercial use outside academia. In this review, we consider the motivations for bioreporter development and discuss the suitability of extant bioreporters for the proposed applications, in order to stimulate complementary research and to help researchers develop realistic objectives. This includes the identification of some popular misconceptions about the qualities and shortcomings of bioreporters.
Abstract:
EMBnet is a consortium of collaborating bioinformatics groups located mainly within Europe (http://www.embnet.org). Each member country is represented by a 'node', a group responsible for the maintenance of local services for their users (e.g. education, training, software, database distribution, technical support, helpdesk). Among these services a web portal with links and access to locally developed and maintained software is essential and different for each node. Our web portal targets biomedical scientists in Switzerland and elsewhere, offering them access to a collection of important sequence analysis tools mirrored from other sites or developed locally. We describe here the Swiss EMBnet node web site (http://www.ch.embnet.org), which presents a number of original services not available anywhere else.
Abstract:
Dual-energy X-ray absorptiometry (DXA) is commonly used in the care of patients for diagnostic classification of osteoporosis, low bone mass (osteopenia), or normal bone density; assessment of fracture risk; and monitoring changes in bone density over time. The development of other technologies for the evaluation of skeletal health has been associated with uncertainties regarding their applications in clinical practice. Quantitative ultrasound (QUS), a technology for measuring properties of bone at peripheral skeletal sites, is more portable and less expensive than DXA, without the use of ionizing radiation. The proliferation of QUS devices that are technologically diverse, measuring and reporting variable bone parameters in different ways, examining different skeletal sites, and having differing levels of validating data for association with DXA-measured bone density and fracture risk, has created many challenges in applying QUS for use in clinical practice. The International Society for Clinical Densitometry (ISCD) 2007 Position Development Conference (PDC) addressed clinical applications of QUS for fracture risk assessment, diagnosis of osteoporosis, treatment initiation, monitoring of treatment, and quality assurance/quality control. The ISCD Official Positions on QUS resulting from this PDC, the rationale for their establishment, and recommendations for further study are presented here.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA, and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike the CISC processor business, the RISC processor architecture industry is separate from the RISC chip manufacturing industry. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed, with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based, or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, first through research on cost-competitiveness efforts in the captive outsourcing of engineering, research, and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry's actors, namely customers, incumbents, and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focusing on maintaining its own proprietary solutions.
The rise of de facto standards such as the IBM PC, Unix, Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung, and others, has created the new markets of personal computers, smartphones, and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.