983 results for Big Science
Abstract:
If there is a fatal conjunction, it is the one between war and science, in the name of the imperatives of national defence. It is by now an established fact that the Second World War was as extraordinarily productive in the scientific field as it was inhuman, from criminal Nazi biomedicine to American atomic physics. Many of the technical applications we now live with peacefully, in the greatest candour and innocence, originated in fundamental and applied research carried out during the Second World War by both sides of the conflict. Moreover, the pudenda origo of the Big Science that is so familiar to us today is to be found in the great scientific research projects undertaken and supported by the belligerent states during the world conflict, projects that the Cold War did no more than prolong. We are heirs to a science pursued in the name of raison d'État and the higher interests of national defence, wholly subordinated to military ends invoked by the Allies and the Axis alike. Abundant examples of this are known, from radar to antibiotics and nuclear energy, developed on the Allied side. Less well known, but certainly incomparably more disturbing, are some of the fruits of the medical experimentation carried out in the Nazi concentration-camp universe, whose results were later exploited by subsequent science.
Abstract:
This dissertation focuses on new technology commercialization, innovation and new business development. Industry-based novel technology may achieve commercialization through its transfer to a large research laboratory acting as a lead user and technical partner, providing the new technology with complementary assets and a meaningful initial use in social practice. The research lab benefits from the new technology and innovation through major performance improvements and cost savings. Such mutually beneficial collaboration between the lab and the firm does not require any additional administrative effort or funds from the lab, yet it requires openness to technologies and partner companies that may not be previously known to the lab. Labs achieve these benefits by applying a proactive procurement model that promotes active pre-tender search for new technologies and pre-tender testing and piloting of these technological options. The collaboration works best when based on the development needs of both parties: first, the lab has significant engineering activity with well-defined technological needs, and second, the firm has advanced prototype technology yet needs further testing, piloting, and an initial market and references to achieve a market breakthrough. The empirical evidence of the dissertation is based on a longitudinal multiple-case study with the European Laboratory for Particle Physics. The key theoretical contribution of this study is that large research labs, including basic research labs, play an important role in product and business development toward the end, rather than the front end, of the innovation process. This also implies that product orientation and business orientation can contribute to basic research. The study provides practical managerial and policy guidelines on how to initiate and manage mutually beneficial lab-industry collaboration and proactive procurement.
Abstract:
The discourse on innovation steers public medical scientific research toward short-term technological and economic development. In this respect, regenerative medicine is an innovative therapy marked by a logic of speculative accumulation that bears both on human cells and on the way research is conducted. A reorganization of scientific research tied to a new economic conception of science and technology, together with a different role assigned to the State, constitutes the contemporary institutional framework that emerged at the end of the 1970s. The change induced by this idea of innovation, and on which this thesis dwells, concerns not the use or destination of science but the extension of economic reasoning. This reasoning does not arrive at the development stage, after research has been carried out under the "Big Science" model; on the contrary, it works its way back from the market and settles very early on, at the stage of understanding biological mechanisms, and within a space that belongs to collective property: the public laboratory. The shift of public scientific research from an "exogenous" to an "endogenous" position with respect to the economy is at the heart of a discussion on the hegemony of market logic.
Abstract:
The curated commons is a model in which a flexible library building shell and its infrastructure can respond to the specific, time-sensitive needs of differing clients. It applies to faculty research, in particular small-science activities (as opposed to big-science activities, which have major support that includes proprietary laboratories and facilities). It provides for sustained transformation of library facilities, together with their utilitarian and cyber-infrastructures, into flexible, reconfigurable space with cutting-edge technology and sustained funding streams.
Abstract:
Cloud computing offers massive scalability and elasticity required by many scientific and commercial applications. Combining the computational and data handling capabilities of clouds with parallel processing also has the potential to tackle Big Data problems efficiently. Science gateway frameworks and workflow systems enable application developers to implement complex applications and make these available for end-users via simple graphical user interfaces. The integration of such frameworks with Big Data processing tools on the cloud opens new opportunities for application developers. This paper investigates how workflow systems and science gateways can be extended with Big Data processing capabilities. A generic approach based on infrastructure-aware workflows is suggested and a proof of concept is implemented based on the WS-PGRADE/gUSE science gateway framework and its integration with the Hadoop parallel data processing solution based on the MapReduce paradigm in the cloud. The provided analysis demonstrates that the methods described to integrate Big Data processing with workflows and science gateways work well in different cloud infrastructures and application scenarios, and can be used to create massively parallel applications for scientific analysis of Big Data.
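To make the MapReduce paradigm mentioned above concrete, the following is a minimal, generic word-count sketch in the Hadoop Streaming style. It is an illustration of the paradigm only, not code from the WS-PGRADE/gUSE integration; the file name wordcount.py and the map/reduce command-line switch are assumptions.

```python
#!/usr/bin/env python3
"""Minimal MapReduce-style word count (Hadoop Streaming convention).

Illustrative sketch only, not the WS-PGRADE/gUSE integration code.
Run as: cat input.txt | python3 wordcount.py map | sort | python3 wordcount.py reduce
"""
import sys


def mapper(stream):
    # Emit one (word, 1) pair per token, tab-separated as Hadoop Streaming expects.
    for line in stream:
        for word in line.strip().split():
            print(f"{word}\t1")


def reducer(stream):
    # Input is sorted by key, so counts for the same word arrive contiguously.
    current_word, current_count = None, 0
    for line in stream:
        word, count = line.rstrip("\n").split("\t")
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")


if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "reduce":
        reducer(sys.stdin)
    else:
        mapper(sys.stdin)
```

In a Hadoop Streaming job, the two functions would typically be registered via the -mapper and -reducer options, and a science gateway could then submit such a job as one node of an infrastructure-aware workflow.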
Abstract:
The generation of heterogeneous big data sources with ever-increasing volumes, velocities and veracities over the last few years has inspired the data science and research community to address the challenge of extracting knowledge from big data. Such a wealth of generated data across the board can be intelligently exploited to advance our knowledge about our environment, public health, critical infrastructure and security. In recent years we have developed generic approaches to process such big data at multiple levels for advancing decision support. This specifically concerns data processing with semantic harmonisation, low-level fusion, analytics, and knowledge modelling with high-level fusion and reasoning. Such approaches will be introduced and presented in the context of the TRIDEC project results on critical oil and gas industry drilling operations and of the ongoing large-scale eVacuate project on critical crowd behaviour detection in confined spaces.
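As a rough sketch of the multi-level processing described above (semantic harmonisation, low-level fusion, analytics, and high-level fusion/reasoning), the Python fragment below chains four placeholder stages. The field names, threshold, and sample records are invented for illustration and do not come from the TRIDEC or eVacuate systems.

```python
"""Illustrative multi-level data-processing pipeline (not TRIDEC/eVacuate code)."""
from statistics import mean


def harmonise(record):
    # Semantic harmonisation: map heterogeneous field names to a common schema.
    return {"sensor": record.get("id") or record.get("sensor_id"),
            "value": float(record.get("val") or record.get("value")),
            "unit": record.get("unit", "unknown")}


def low_level_fusion(records):
    # Low-level fusion: combine readings from the same sensor into one estimate.
    fused = {}
    for r in records:
        fused.setdefault(r["sensor"], []).append(r["value"])
    return {sensor: mean(values) for sensor, values in fused.items()}


def analytics(fused):
    # Analytics: flag sensors whose fused value exceeds a hypothetical threshold.
    return {sensor: value for sensor, value in fused.items() if value > 100.0}


def reason(alerts):
    # High-level fusion / reasoning: turn individual alerts into decision support.
    if len(alerts) >= 2:
        return "ALERT: multiple sensors above threshold - escalate to operator"
    return "Situation normal"


if __name__ == "__main__":
    raw = [{"id": "s1", "val": "120.5"},
           {"sensor_id": "s2", "value": 130.0, "unit": "bar"},
           {"id": "s1", "val": "118.0"}]
    print(reason(analytics(low_level_fusion([harmonise(r) for r in raw]))))
```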
Abstract:
Despite the existence of a multitude of studies on sentiment analysis, few works address its practical, real-world implementation and its integration with business intelligence and big data, so that such sentiment analyses are embedded in an architecture (supporting the whole process, from data acquisition to exploitation with BI tools) applied to crisis management. This work investigates how to join the worlds of analysis (of sentiment and crises) and of technology (everything related to business intelligence, data mining and Big Data) and to create a Business Intelligence solution that encompasses data mining and sentiment analysis (based on large volumes of data) and helps companies and/or governments with crisis management. The author set out to study ways of working with large volumes of data, what is currently known as Big Data Science, or data science applied to large volumes of data (Big Data), and to combine this technology with sentiment analysis of a real situation (in this work, the chosen situation was the impeachment process of the president of Brazil, Dilma Rousseff). This combination used business intelligence techniques for building dashboards and ETL (Extraction, Transformation and Loading) routines for the data, as well as text mining and sentiment analysis techniques. The work was developed in several parts and with several data sources (datasets), owing to the various technology tests carried out throughout the project. One of the most important datasets of the project is the set of tweets collected between December 2015 and January 2016; the collected messages contained the word "Dilma", and all tweets were gathered with the Twitter Streaming API. It is important to understand that what is published on the social network Twitter cannot be manipulated and represents the opinion of the person or entity that publishes the message, which is why mining Twitter data can be considered efficient and truthful. On 3 December 2015 the petition to open the impeachment process against the president of Brazil, Dilma Rousseff, was accepted. The petition was accepted by the president of the Chamber of Deputies, Mr. Eduardo Cunha (PMDB-RJ), creating expectations about the sentiment of the population and the future of Brazil. Data were also collected from Google searches for the word Dilma; based on these data, the objective is to arrive at a global sentiment analysis (not only one based on the collected tweets). Using just two sources (Twitter and Google searches), a great deal of data was extracted, but there are many other sources from which it is possible to obtain information about people's opinions on a particular topic. Thus, a tool that can collect, extract and store so much data and present the information effectively in support of decision-making contributes to crisis management.
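As an illustration of the sentiment-scoring step described above, the sketch below applies a tiny lexicon-based classifier to a handful of tweet texts. The word lists, the scoring rule, and the sample tweets are assumptions made for demonstration and are not the lexicon or tooling used in the thesis.

```python
"""Minimal lexicon-based sentiment scoring over collected tweets.

Illustrative sketch only: word lists, scoring rule, and sample tweets are
invented for demonstration, not taken from the thesis pipeline.
"""
from collections import Counter

POSITIVE = {"bom", "boa", "otimo", "apoio", "forca"}               # hypothetical positive terms
NEGATIVE = {"ruim", "fora", "crise", "corrupcao", "impeachment"}   # hypothetical negative terms


def score(tweet: str) -> int:
    # +1 per positive token, -1 per negative token; 0 means neutral.
    tokens = tweet.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)


def classify(tweet: str) -> str:
    s = score(tweet)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"


if __name__ == "__main__":
    sample = ["Dilma fora crise", "apoio a Dilma forca", "Dilma governo"]
    summary = Counter(classify(t) for t in sample)
    print(dict(summary))  # e.g. {'negative': 1, 'positive': 1, 'neutral': 1}
```

Aggregated counts of this kind are what would then feed the BI dashboards mentioned above, alongside the Google search data.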
Abstract:
The sustainability of current harvest practices for high-value Meliaceae can be assessed by quantifying logging intensity and projecting growth and survival of post-logging populations over anticipated intervals between harvests. From 100%-area inventories of big-leaf mahogany (Swietenia macrophylla) covering 204 ha or more at eight logged and unlogged forest sites across southern Brazilian Amazonia, we report generally higher landscape-scale densities and smaller population-level mean diameters in eastern forests compared to western forests, where most commercial stocks survive. Density of trees ≥ 20 cm diameter varied by two orders of magnitude and peaked at 1.17 ha⁻¹. Size-class frequency distributions appeared unimodal at two high-density sites, but were essentially amodal or flat elsewhere; diameter increment patterns indicate that populations were multi- or all-aged. At two high-density sites, conventional logging removed 93-95% of commercial trees (≥ 45 cm diameter at the time of logging), illegally eliminated 31-47% of sub-merchantable trees, and targeted trees as small as 20 cm diameter. Projected recovery by commercial stems during 30 years after conventional logging represented 9.9-37.5% of initial densities and was highly dependent on initial logging intensity and the size-class frequency distributions of commercial trees. We simulated post-logging recovery over the same period at all sites according to the 2003 regulatory framework for mahogany in Brazil, which raised the minimum diameter cutting limit to 60 cm and requires retention of 20% of commercial-sized trees during the first harvest. Recovery during 30 years ranged from approximately 0 to 31% above the 20% retention densities at seven of eight sites. At only one site, where sub-merchantable trees dominated the population, did the simulated density of harvestable stems after 30 years exceed initial commercial densities. These results indicate that an 80% harvest intensity will not be sustainable over multiple cutting cycles for most populations without silvicultural interventions ensuring establishment and long-term growth of artificial regeneration to augment depleted natural stocks, including repeated tending of outplanted seedlings. Without improved harvest protocols for mahogany in Brazil, as explored in this paper, future commercial supplies of this species, as well as of other high-value tropical timbers, are endangered. Rapid changes in the timber industry and land use in the Amazon also pose significant challenges to sustainable management of mahogany.
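The recovery projections described above can be illustrated with a toy simulation: the sketch below grows a hypothetical diameter list for 30 years under assumed increment and mortality rates, applies the 60 cm cutting limit with 20% retention at the first harvest, and counts the stems that re-enter the harvestable class. All parameter values and the example stand are invented for illustration and are not the values fitted in the study.

```python
"""Toy projection of post-logging recovery for a mahogany population.

All parameters (increment, mortality, initial diameters) are illustrative
assumptions, not the values estimated in the study.
"""

ANNUAL_INCREMENT_CM = 0.5   # assumed mean diameter growth per year
ANNUAL_MORTALITY = 0.02     # assumed annual mortality rate (applied deterministically)
CUTTING_LIMIT_CM = 60.0     # 2003 Brazilian regulatory minimum diameter
RETENTION_FRACTION = 0.20   # commercial-sized trees retained at first harvest
YEARS = 30


def project(diameters_cm):
    # First harvest: remove 80% of trees at or above the cutting limit, keep the rest.
    commercial = [d for d in diameters_cm if d >= CUTTING_LIMIT_CM]
    keep = max(1, int(len(commercial) * RETENTION_FRACTION)) if commercial else 0
    survivors = [d for d in diameters_cm if d < CUTTING_LIMIT_CM] + commercial[:keep]

    # Grow survivors for YEARS, discounting the cohort by the assumed mortality rate.
    weight = (1.0 - ANNUAL_MORTALITY) ** YEARS
    grown = [d + ANNUAL_INCREMENT_CM * YEARS for d in survivors]
    harvestable = sum(weight for d in grown if d >= CUTTING_LIMIT_CM)
    return len(commercial), harvestable


if __name__ == "__main__":
    stand = [22, 28, 35, 41, 47, 52, 58, 63, 70, 85]  # hypothetical diameters (cm) on one plot
    initial, recovered = project(stand)
    print(f"initial commercial stems: {initial}, "
          f"projected harvestable after {YEARS} yr: {recovered:.2f}")
```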
Abstract:
A series of laboratory and animal studies examined the use of chemical and biological agents to enhance the digestibility of Rhodes grass cut at 60 (young) and 100 (mature) days of regrowth and ensiled as big round bales. The treatments included an untreated control (C), a microbial inoculant (I), NaOH, CaO and NaOH plus inoculant (NaOH + I). The inoculant was grown anaerobically, using a starter culture of rumen fluid from cattle given Rhodes grass. Treatments C, I, NaOH and NaOH + I were offered separately to twelve dairy heifers in a 3 × 4 randomized complete block design, repeated twice for each grass silage. C and I had substantial mould growth, compared with no visible mould in NaOH or NaOH + I. CaO treatment was effective in preventing mould growth, but had little effect on the chemical composition and in sacco digestibility of mature-grass silage. NaOH reduced NDF content and increased the in sacco digestibility (P < 0.05) but not the in vivo digestibility (P > 0.05) of both mature- and young-grass silage. The effects of the other treatments on nutritive value were non-significant at both stages of maturity. NaOH increased the intake of mature-grass silage by 24-26% (P < 0.05), but had little effect on the intake of young-grass silage (P > 0.05). Treatment I consistently reduced the intake of young-grass silage (P < 0.05). The findings of these studies show that treating mature Rhodes grass with NaOH will improve its nutritive value and reduce mould growth in conserved herbage. However, none of the treatments in this study had consistently positive effects on the in vivo nutritive value or storage quality of young-grass silage.
Abstract:
Triple negative breast cancer (TNBC) is a particular immunopathological subtype of breast cancer that lacks expression of estrogen and progesterone receptors (ER/PR) and amplification of the human epidermal growth factor receptor 2 (HER2) gene. Characterized by aggressive and metastatic phenotypes and high rates of relapse, TNBC is the only breast cancer subgroup still lacking effective therapeutic options, thus presenting the worst prognosis. The development of targeted therapies, as well as early diagnosis methods, is vital to ensure an adequate and timely therapeutic intervention in patients with TNBC. This review intends to discuss potentially emerging approaches for the diagnosis and treatment of TNBC patients, with a special focus on nano-based solutions that actively target these particular tumors.
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management.