951 results for Significance-driven computing
Abstract:
Purpose: The purpose of this paper is to analyse the impact of business exits on future dimensions of entrepreneurial activity at the macroeconomic level. Design/methodology/approach: This research uses Global Entrepreneurship Monitor (GEM) data for 41 countries and the Generalized Method of Moments (GMM) to carry out the analysis. The paper differentiates the effects of the two components of total entrepreneurial activity and of its two motivations – opportunity and necessity entrepreneurship. Findings: The results presented here show a positive and significant coefficient associated with exits in all models, meaning that levels of entrepreneurial activity exceed business exits. The robustness of the models is tested by including other variables such as fear of failure, Gross Domestic Product, role models, entrepreneurial skills and unemployment. The main hypothesis, which stated that at the national level business exits imply greater rates of opportunity-driven entrepreneurship, is corroborated. Originality/value: One would expect unemployment rates to imply higher levels of necessity entrepreneurship. However, the results show that unemployment rates in fact favour opportunity entrepreneurship. This could be due to government policies aimed at promoting entrepreneurship by allowing unemployment benefits to be capitalized and fully invested in a new start-up. To the best of our knowledge, this is the first panel data study to link previous exit rates to future dimensions of entrepreneurial activity. Keywords: Entrepreneurship, business exits, social values, industrial organization. Paper type: Research paper
Abstract:
How a stimulus or a task alters the spontaneous dynamics of the brain remains a fundamental open question in neuroscience. One of the most robust hallmarks of task/stimulus-driven brain dynamics is the decrease of variability with respect to the spontaneous level, an effect seen across multiple experimental conditions and in brain signals observed at different spatiotemporal scales. Recently, it was observed that the trial-to-trial variability and temporal variance of functional magnetic resonance imaging (fMRI) signals decrease in the task-driven activity. Here we examined the dynamics of a large-scale model of the human cortex to provide a mechanistic understanding of these observations. The model allows computing the statistics of synaptic activity in the spontaneous condition and in putative tasks determined by external inputs to a given subset of brain regions. We demonstrated that external inputs decrease the variance, increase the covariances, and decrease the autocovariance of synaptic activity as a consequence of single node and large-scale network dynamics. Altogether, these changes in network statistics imply a reduction of entropy, meaning that the spontaneous synaptic activity outlines a larger multidimensional activity space than does the task-driven activity. We tested this model's prediction on fMRI signals from healthy humans acquired during rest and task conditions and found a significant decrease of entropy in the stimulus-driven activity. Altogether, our study proposes a mechanism for increasing the information capacity of brain networks by enlarging the volume of possible activity configurations at rest and reliably settling into a confined stimulus-driven state to allow better transmission of stimulus-related information.
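The entropy reduction described above can be made concrete for Gaussian-approximated signals: the differential entropy of a multivariate Gaussian grows with the log-determinant of its covariance matrix, so shrinking variances (task condition) lowers entropy relative to rest. A minimal numerical sketch; the covariance values below are illustrative, not taken from the study:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (in nats) of a zero-mean multivariate
    Gaussian with covariance matrix `cov`."""
    n = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    assert sign > 0, "covariance must be positive definite"
    return 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)

# Hypothetical covariances: "task" uniformly shrinks the variability
# observed at "rest", mimicking the reported variance decrease.
rest_cov = np.array([[1.0, 0.2],
                     [0.2, 1.0]])
task_cov = 0.5 * rest_cov

print(gaussian_entropy(rest_cov) > gaussian_entropy(task_cov))  # True
```

Under this Gaussian approximation, any uniform shrinkage of the covariance necessarily reduces entropy, i.e. the task-driven state occupies a smaller activity volume than rest.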
Abstract:
A cross-sectional study was conducted including 3 non-cardiologist residents who received basic training in echocardiography (22 theoretical hours, 65 practical hours), following the recommendations of the American Society of Echocardiography and incorporating problem-based learning, with the development of the necessary technical and diagnostic competencies. A concordance analysis was then performed between residents and expert echocardiographers. A total of 122 hospitalized patients meeting the inclusion and exclusion criteria were enrolled; each underwent a conventional echocardiogram by the expert and an echocardiographic assessment by the resident, evaluating the acoustic window, contractility, left ventricular function and pericardial effusion. The stated hypothesis was that moderate concordance would be obtained. Results: Inter-observer concordance was moderate for myocardial contractility (kappa: 0.57, p = 0.000) and left ventricular systolic function (kappa: 0.54, p = 0.000), falling between 0.40 and 0.60 with high statistical significance; for the quality of the acoustic window (kappa: 0.22, p = 0.000) and the presence of pericardial effusion (kappa: 0.26, p = 0.000) concordance was fair, falling between 0.20 and 0.40. Residents' diagnosis of left ventricular systolic dysfunction showed a sensitivity of 90%, specificity of 67%, a positive predictive value of 80% and a negative predictive value of 85%.
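For reference, Cohen's kappa (the inter-observer agreement statistic reported above) and the interpretation bands the abstract uses can be sketched as follows; the example ratings are hypothetical, not the study's data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters scoring the same items."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    cats = sorted(set(a) | set(b))
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pe = sum((a.count(c) / n) * (b.count(c) / n)        # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)

def interpret(kappa):
    """Agreement bands as used in the abstract."""
    if kappa < 0.20: return "poor"
    if kappa < 0.40: return "fair"
    if kappa < 0.60: return "moderate"
    if kappa < 0.80: return "substantial"
    return "almost perfect"

# Hypothetical resident vs. expert ratings (1 = abnormal, 0 = normal):
resident = [1, 1, 0, 0, 1, 0]
expert   = [1, 0, 0, 0, 1, 1]
print(interpret(cohens_kappa(resident, expert)))
```

With these bands, the reported kappas of 0.57 and 0.54 fall in the "moderate" range and 0.22 and 0.26 in the "fair" range, matching the abstract's reading.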
Abstract:
This thesis is divided into two parts. The first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and jump telegraph processes. The study in this first part includes the computation of the distribution of each process, their means and variances, and their moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on jump telegraph processes. This part describes how to compute risk-neutral measures, establishes the no-arbitrage condition for this type of model and, finally, derives the price of European call and put options.
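The basic object of the first part, a symmetric telegraph process, is simple to simulate: a particle moves at velocity +c or -c and reverses direction at the jump times of a Poisson process with rate lam. A minimal sketch, with illustrative parameter values (this is the plain telegraph process, without the jump component the thesis adds for pricing):

```python
import random

def telegraph_path(t_end, c=1.0, lam=2.0, seed=0):
    """Simulate one path of a symmetric telegraph process up to t_end:
    velocity flips between +c and -c at exponential(lam) waiting times;
    returns the terminal position X(t_end)."""
    rng = random.Random(seed)
    t, x = 0.0, 0.0
    v = c if rng.random() < 0.5 else -c   # random initial direction
    while True:
        tau = rng.expovariate(lam)        # time until next velocity switch
        if t + tau >= t_end:
            return x + v * (t_end - t)    # no more switches before t_end
        x += v * tau
        t += tau
        v = -v                            # reverse direction
```

Since the speed is always exactly c, every path satisfies |X(t)| ≤ c·t, the finite-velocity property that distinguishes telegraph models from Brownian-motion-based ones.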
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
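At its simplest, the data-driven benchmarking idea amounts to fitting a cost model to measured runs rather than deriving it from source code analysis. A toy sketch of such a fit; the linear time-versus-problem-size form is an illustrative assumption, not the paper's actual model:

```python
def fit_linear_cost(sizes, times):
    """Least-squares fit of time ≈ a*size + b from benchmark runs:
    a crude stand-in for a data-driven performance model."""
    n = len(sizes)
    mean_s = sum(sizes) / n
    mean_t = sum(times) / n
    a = (sum((s - mean_s) * (t - mean_t) for s, t in zip(sizes, times))
         / sum((s - mean_s) ** 2 for s in sizes))
    b = mean_t - a * mean_s
    return a, b

# Hypothetical benchmark data: (grid size, measured seconds).
sizes = [1, 2, 4, 8]
times = [3.0, 5.0, 9.0, 17.0]
a, b = fit_linear_cost(sizes, times)
predicted = a * 16 + b   # extrapolate to a larger problem size
```

The fitted coefficients can then predict runtimes on problem sizes (or machines) not yet measured, which is the intended use when porting to new hardware.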
Abstract:
This paper presents optical and electrical measurements on plasma generated by DC excited glow discharges in mixtures composed of 95% N2, 4.8% CH4 and 0.2% H2O at pressures varying from 1.064 mbar to 4.0 mbar. The discharges simulate the chemical reactions that may occur in Titan's atmosphere in the presence of meteorites and ice debris coming from Saturn's systems, assisted by cosmic rays and high energy charged particles. The results obtained from actinometric optical emission spectroscopy, combined with the results from a pulsed Langmuir probe, show that chemical species CH, CN, NH and OH are important precursors in the synthesis of the final solid products and that the chemical kinetics is essentially driven by electronic collision processes. It is shown that the presence of water is sufficient to produce complex solid products whose components are important in prebiotic compound synthesis. © 1998 Elsevier Science Ltd. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The diagnosis of T-cell large granular lymphocytic leukemia in association with other B-cell disorders is uncommon but not unknown. However, the concomitant presence of three hematological diseases is extraordinarily rare. We report an 88-year-old male patient with three simultaneous clonal disorders, that is, CD4+/CD8(weak) T-cell large granular lymphocytic leukemia, monoclonal gammopathy of undetermined significance and monoclonal B-cell lymphocytosis. The patient has only minimal complaints and has no anemia, neutropenia or thrombocytopenia. Lymphadenopathy and hepatosplenomegaly were not present. The three disorders were characterized by flow cytometry analysis, and the clonality of the T-cell large granular lymphocytic leukemia was confirmed by polymerase chain reaction. Interestingly, the patient has two distinct B-cell clones: the plasma cells of the monoclonal gammopathy of undetermined significance exhibited a kappa light-chain-restricted population, whereas the B-lymphocytes of the monoclonal B-cell lymphocytosis exhibited a lambda light-chain-restricted population. This finding does not support the antigen-driven hypothesis for the development of multi-compartment diseases, but suggests that the T-cell large granular lymphocytic expansion might represent a direct antitumor immunological response to both the aberrant B-cell and plasma-cell populations, as part of the immune surveillance against malignant neoplasms.
Abstract:
In recent years, cloud computing has become an increasingly common topic. The underlying idea is to pay only for the actual use of a service available on the network, with the ability to vary the resources used according to need, whether standard applications or data storage space. When use of the Web began to spread, the Internet was depicted as a cloud, conveying the idea of an entity external to our home or workplace, something outside the usual places where PCs are used. That representation is now useful for explaining the concept of cloud computing: thanks to this technology, data and programs normally kept on our computers can now reside in the cloud. Many IT departments are forced to devote a significant portion of their time to implementation, maintenance and upgrade projects that often provide no real value to the company. Development teams have therefore begun turning to this emerging technology to minimize the time spent on low-value activities and concentrate on the strategic activities that can make a difference for a company. Indeed, a cloud computing infrastructure promises administrative cost savings of up to 50% compared with standard client/server software. This new technology is ushering in an epochal change in the world of application development.
The shift now under way towards cloud computing solutions makes it possible to build robust applications in considerably shorter times and at far lower costs, while avoiding all the hassles associated with servers, stand-alone software solutions and upgrades, not to mention the staff needed to manage all of this. The aim of this thesis is to provide an overview of the design and development of Web applications in cloud computing, analysing its strengths and weaknesses relative to current software solutions. The first chapter gives a general picture of the cloud, highlighting its fundamental characteristics, examining its architecture and weighing the advantages and disadvantages of the platform. The second chapter presents the new design methodology in the cloud, first comparing it with standard software development and then analysing the impact cloud computing has on design. The third chapter addresses the design and development of SaaS applications, specifying their common characteristics and listing the state-of-the-art platforms, with particular attention to the Windows Azure platform. The fourth chapter analyses in detail the development of multi-tenant SaaS applications, specifying levels and characteristics, up to an explanation of metadata-driven architectures. The fifth chapter compares two possible approaches to developing cloud software, analysing their differences in terms of non-functional requirements. Finally, the sixth chapter surveys the design costs of a cloud application.
Abstract:
The Web is constantly evolving: thanks to the 2.0 transition, the new features of HTML5 and the advent of cloud computing, the gap between Web and traditional desktop applications is tailing off. Web apps are more and more widespread and bring several benefits compared to traditional applications. On the other hand, the reference technologies, JavaScript primarily, are not keeping pace, so a paradigm shift is taking place in Web programming, and many new languages and technologies are emerging. The first objective of this thesis is to survey the reference and state-of-the-art technologies for client-side Web programming, focusing in particular on concurrency and asynchronous programming. Taking into account the problems that affect existing technologies, we finally design simpAL-web, an innovative approach to Web-app development based on the Agent-oriented programming abstraction and the simpAL language.
Abstract:
We propose a computationally efficient and biomechanically relevant soft-tissue simulation method for cranio-maxillofacial (CMF) surgery. A template-based facial muscle reconstruction was introduced to minimize the efforts on preparing a patient-specific model. A transversely isotropic mass-tensor model (MTM) was adopted to realize the effect of directional property of facial muscles in reasonable computation time. Additionally, sliding contact around teeth and mucosa was considered for more realistic simulation. Retrospective validation study with postoperative scan of a real patient showed that there were considerable improvements in simulation accuracy by incorporating template-based facial muscle anatomy and sliding contact.
Immediate Search in the IDE as an Example of Socio-Technical Congruence in Search-Driven Development
Abstract:
Search-driven development is mainly concerned with code reuse, but also with code navigation and debugging. In this essay we look at search-driven navigation in the IDE. We consider Smalltalk-80 as an example of a programming system with search-driven navigation capabilities and explore its human factors. We present how immediate search results lead to a user experience of browsing code rather than one of waiting for and clicking through search results. We explore the socio-technical congruence of immediate search, i.e. the unification of tasks and breakpoints with method calls, which leads to simpler and more extensible development tools. We conclude with remarks on the socio-technical congruence of search-driven development.
Abstract:
The evolution of Next Generation Networks, especially wireless broadband access technologies such as Long Term Evolution (LTE) and Worldwide Interoperability for Microwave Access (WiMAX), has increased the number of "all-IP" networks across the world. The enhanced capabilities of these access networks have spearheaded the cloud computing paradigm, where end-users aim to have services accessible anytime and anywhere. Service availability is also tied to the end-user device, where one of the major constraints is battery lifetime. It is therefore necessary to assess and minimize the energy consumed by end-user devices, given its significance for the user-perceived quality of cloud computing services. In this paper, an empirical methodology to measure the energy consumption of network interfaces is proposed. Employing this methodology, an experimental evaluation of energy consumption in three different cloud computing access scenarios (including WiMAX) was performed. The empirical results show the impact of accurate network interface state management and application-level network design on energy consumption. Additionally, the outcomes can be used in further software-based models to optimize energy consumption and increase the Quality of Experience (QoE) perceived by end-users.
Abstract:
We describe a system for performing SLA-driven management and orchestration of distributed infrastructures composed of services supporting mobile computing use cases. In particular, we focus on a Follow-Me Cloud scenario in which we consider mobile users accessing cloud-enabled services. We combine an SLA-driven approach to infrastructure optimization with forecast-based preventive actions against performance degradation and pattern detection for supporting mobile cloud infrastructure management. We present our system's information model and architecture, including the algorithmic support and the proposed scenarios for system evaluation.