841 results for Robotic benchmarks
Abstract:
Assays that measure a patient's immune response play an increasingly important role in the development of immunotherapies. The inherent complexity of these assays and independent protocol development between laboratories result in high data variability and poor reproducibility. Quality control through harmonization--based on integration of laboratory-specific protocols with standard operating procedures and assay performance benchmarks--is one way to overcome these limitations. Harmonization guidelines can be widely implemented to address assay performance variables. This process enables objective interpretation and comparison of data across clinical trial sites and also facilitates the identification of relevant immune biomarkers, guiding the development of new therapies.
Abstract:
Fault tolerance has become a major issue for computer and software engineers because the occurrence of faults increases the cost of using a parallel computer. RADIC is a fault tolerance architecture for message passing systems that is transparent, decentralized, flexible and scalable. This master's thesis presents the methodology used to implement the RADIC architecture over Open MPI, a well-known and widely used message passing library. The implementation preserves the characteristics of the RADIC architecture. To validate the implementation we executed a synthetic ping program; to evaluate its performance we used the NAS Parallel Benchmarks. The results show that the performance of the RADIC architecture depends on the communication pattern of the running parallel application. Furthermore, our implementation demonstrates that the RADIC architecture can be implemented over an existing message passing library.
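As an illustration of the kind of synthetic ping validation described above, here is a minimal ping-pong sketch in Python with mpi4py (an illustrative stand-in: the thesis works inside Open MPI itself, and the message size, iteration count and timing scheme below are our assumptions, not the thesis's):

```python
# Minimal MPI ping-pong sketch (hypothetical; run with: mpirun -np 2 python ping.py)
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
payload = bytearray(1 << 20)          # 1 MiB message buffer (assumed size)

t0 = MPI.Wtime()
for _ in range(100):                  # 100 round trips (assumed count)
    if rank == 0:
        comm.Send(payload, dest=1)    # ping
        comm.Recv(payload, source=1)  # pong
    elif rank == 1:
        comm.Recv(payload, source=0)
        comm.Send(payload, dest=0)
t1 = MPI.Wtime()

if rank == 0:
    print(f"mean round-trip time: {(t1 - t0) / 100 * 1e3:.3f} ms")
```

Running such a program with and without the fault tolerance layer enabled gives a first measure of the overhead that the protection mechanisms add to basic communication.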
Abstract:
A number of studies show that New Public Management reforms have altered the identity benchmarks of public officials, particularly by hybridizing values and management practices. However, existing studies have largely glossed over officials' sense of belonging when their organization straddles the concerns of public service and private enterprise, blurring the boundary between the public and private sectors. The purpose of this article is precisely to explore this sense of belonging in the context of organizational hybridization. It does so by drawing on the results of research conducted among the employees of a public unemployment insurance fund in Switzerland. On the one hand, the analysis shows the extent to which their markers of belonging are hybrid, multiple and constructed in negative terms (with regard to the State), while indicating that the employees' working practices point to an identity that is nevertheless closely bound to the public sector. On the other hand, the analysis shows that the organization plays strategically with its State status, exploiting either its private or its public identity according to the needs of its external image. The article concludes with a discussion of the results, highlighting the strategic functionality of the actors' hybrid identity.
Abstract:
This project studied the design of a mobile robotic platform for PBL (Problem-Based Learning) in computer engineering. The main objective is to introduce this model into university teaching as a complement to several first-year courses. To achieve these objectives, a robotic platform was designed and built, driven by a microcontroller and equipped with various sensors for interacting with the environment. The robot supports different kinds of programming and is specially designed to provide a good educational experience.
Abstract:
Incorporating adaptive learning into macroeconomics requires assumptions about how agents incorporate their forecasts into their decision-making. We develop a theory of bounded rationality that we call finite-horizon learning. This approach generalizes the two existing benchmarks in the literature: Euler-equation learning, which assumes that consumption decisions are made to satisfy the one-step-ahead perceived Euler equation; and infinite-horizon learning, in which consumption today is determined optimally from an infinite-horizon optimization problem with given beliefs. In our approach, agents hold a finite forecasting/planning horizon. We find for the Ramsey model that the unique rational expectations equilibrium is E-stable at all horizons. However, transitional dynamics can differ significantly depending upon the horizon.
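As a sketch of the distinction (using generic CRRA-style notation that is our assumption, not necessarily the paper's), Euler-equation learning imposes only the one-step-ahead perceived Euler equation, while finite-horizon learning chains the perceived equation forward to the planning horizon t + N:

```latex
% One-step-ahead perceived Euler equation (Euler-equation learning, N = 1):
u'(c_t) = \beta \, \hat{E}_t\!\left[ (1 + r_{t+1}) \, u'(c_{t+1}) \right]

% Finite-horizon learning: iterate the perceived equation N periods ahead,
% so today's consumption depends on forecasts up to the horizon t + N:
u'(c_t) = \beta^{N} \, \hat{E}_t\!\left[ \prod_{k=1}^{N} (1 + r_{t+k}) \; u'(c_{t+N}) \right]
```

Here \hat{E}_t denotes the agent's subjective (adaptive) forecast; letting N grow without bound recovers infinite-horizon learning.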
Abstract:
In this paper we examine the out-of-sample performance of high-yield credit spreads in forecasting real-time and revised data on employment and industrial production in the US. We evaluate models using both a point forecast and a probability forecast exercise. Our main findings suggest the use of a few factors obtained by pooling information from a number of sector-specific high-yield credit spreads. This can be justified by observing that, especially for employment, there is a gain from using a principal components model fitted to high-yield credit spreads compared to the predictions produced by benchmarks such as AR and ARDL models that use either the term spread or the aggregate high-yield spread as an exogenous regressor. Moreover, forecasts based on real-time data are generally comparable to forecasts based on revised data.
JEL Classification: C22; C53; E32
Keywords: Credit spreads; Principal components; Forecasting; Real-time data.
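A minimal sketch of the factor-pooling idea with synthetic stand-in data (the series, the two-component choice and the estimation split below are illustrative assumptions, not the paper's specification):

```python
# Hypothetical sketch: pool sector-level spreads into a few principal
# components and regress future employment growth on them.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, n_sectors, h = 200, 12, 1                      # sample length, sectors, horizon
spreads = rng.standard_normal((T, n_sectors))     # stand-in for sector HY spreads
target = rng.standard_normal(T)                   # stand-in for employment growth

factors = PCA(n_components=2).fit_transform(spreads)  # pool sectors into few factors
X, y = factors[:-h], target[h:]                       # align factors with t+h target
model = LinearRegression().fit(X[:150], y[:150])      # estimation window
print(model.predict(X[150:])[:5])                     # pseudo out-of-sample forecasts
```

A genuine real-time exercise would re-estimate the factors and the regression recursively at each forecast origin, using only the data vintage available at that date.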
Abstract:
Time varying parameter (TVP) models have enjoyed increasing popularity in empirical macroeconomics. However, TVP models are parameter-rich and risk over-fitting unless the dimension of the model is small. Motivated by this concern, this paper proposes several time varying dimension (TVD) models, in which the dimension of the model can change over time, allowing the model to automatically choose a more parsimonious TVP representation or to switch between different parsimonious representations. Our TVD models all fall in the category of dynamic mixture models. We discuss the properties of these models and present methods for Bayesian inference. An application involving US inflation forecasting illustrates and compares the different TVD models. We find that our TVD approaches exhibit better forecasting performance than several standard benchmarks and shrink towards parsimonious specifications.
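As a toy illustration of the dynamic-mixture idea (entirely hypothetical; the paper's models and their Bayesian treatment are far richer), a latent indicator can switch a random-walk TVP coefficient in and out of the model, so the effective dimension changes over time:

```python
# Simulate a time varying dimension regression: s_t switches the regressor
# in and out, nesting a random-walk TVP coefficient when s_t = 1.
import numpy as np

rng = np.random.default_rng(1)
T = 300
theta = np.cumsum(rng.normal(scale=0.05, size=T))   # random-walk TVP path
s = np.zeros(T, dtype=int)                          # latent inclusion indicator
for t in range(1, T):
    # sticky switching: keep the current dimension with probability 0.98
    s[t] = s[t - 1] if rng.random() < 0.98 else 1 - s[t - 1]
beta = s * theta                                    # coefficient drops out when s_t = 0
x = rng.standard_normal(T)
y = beta * x + rng.normal(scale=0.1, size=T)        # simulated observations
print(f"regressor included in {s.mean():.0%} of periods")
```

Inference then amounts to recovering both the coefficient path theta_t and the inclusion sequence s_t from the data, which is what shrinks the model toward parsimonious specifications.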
Abstract:
This paper extends the Nelson-Siegel linear factor model by developing a flexible macro-finance framework for modeling and forecasting the term structure of US interest rates. Our approach is robust to parameter uncertainty and structural change, as we consider instabilities in parameters and volatilities, and our model averaging method allows for investors' model uncertainty over time. Our time-varying parameter Nelson-Siegel Dynamic Model Averaging (NS-DMA) predicts yields better than standard benchmarks and successfully captures plausible time-varying term premia in real time. The proposed model has significant in-sample and out-of-sample predictability for excess bond returns, and the predictability is of economic value.
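For reference, the dynamic Nelson-Siegel model that this framework extends writes the yield of maturity τ in terms of time-varying level, slope and curvature factors (the fixed decay parameter λ shown here is the common Diebold-Li simplification; the paper's time-varying parameters and model averaging sit on top of this structure):

```latex
y_t(\tau) = \beta_{1t}
          + \beta_{2t}\,\frac{1 - e^{-\lambda\tau}}{\lambda\tau}
          + \beta_{3t}\left(\frac{1 - e^{-\lambda\tau}}{\lambda\tau} - e^{-\lambda\tau}\right)
```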
Abstract:
This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in such a way as to be appropriate to decision makers with little or no background in economics and operational research. The use of mathematics is kept to a minimum. The guide therefore adopts a strongly practical approach, allowing decision makers to conduct their own efficiency analysis and to easily interpret the results. DEA helps decision makers in the following ways:
- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for inputs and outputs, it calculates by how much inputs must be decreased or outputs increased for the firm to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize the average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes a firm should analyse in order to improve its own practices.
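A minimal sketch of how such an efficiency score can be computed, using the standard input-oriented CCR envelopment program (this formulation and the toy data are our illustration; the guide itself deliberately avoids the mathematics):

```python
# DEA efficiency scores via linear programming (input-oriented CCR model).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency score of firm o; rows of X are inputs, rows of Y outputs."""
    n, m = X.shape                                   # n firms, m inputs
    s = Y.shape[1]                                   # s outputs
    c = np.r_[1.0, np.zeros(n)]                      # minimize theta
    A_in = np.c_[-X[o].reshape(m, 1), X.T]           # sum_j lam_j x_ji <= theta x_oi
    A_out = np.c_[np.zeros((s, 1)), -Y.T]            # sum_j lam_j y_jr >= y_or
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    bounds = [(None, None)] + [(0, None)] * n        # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Toy data: 4 firms, 2 inputs, 1 identical output -- made up for illustration.
X = np.array([[2.0, 5.0], [4.0, 3.0], [6.0, 6.0], [3.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(len(X))])
```

A score of 1 means the firm lies on the efficient frontier; a score below 1 gives the proportional input reduction needed to reach it, and the optimal lambdas identify the benchmark firms whose practices are worth analysing.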
Abstract:
During the last decade, extensive evidence has been provided that release of chemical transmitters from astrocytes might modulate neuronal activity in situ (so-called "gliotransmission"). Nevertheless, gliotransmission remains a highly debated topic because of the lack of direct morphological and functional evidence. Here we provide new information supporting gliotransmission by i) deepening knowledge of the specific properties of the regulated secretion of glutamatergic SLMVs, and ii) investigating the involvement of astrocytes in the transmission of dopamine, a molecule whose interaction with astrocytes is likely to occur but is still not proven.

VGLUT-expressing glutamatergic SLMVs had previously been identified both in situ and in vitro, but a description of their release kinetics was still lacking. To elucidate this issue, we took advantage of fluorescent tools (styryl dyes and pHluorin) and adapted experimental paradigms and analysis methods previously developed to study the exo-endocytosis and recycling of glutamatergic vesicles at synapses. Parallel use of epifluorescence and total internal reflection fluorescence (TIRF) imaging allowed us to find that exo-endocytosis processes in astrocytes are extremely fast, with kinetics on the order of milliseconds, able to sustain and follow neuronal signalling at synapses. Exocytosis of SLMVs is also under the control of fast, localized Ca2+ elevations in close proximity to SLMVs and endoplasmic reticulum (ER) tubules, the intracellular calcium stores. This complex organization supports the fast stimulus-secretion coupling we describe; localized calcium elevations have recently been observed in astrocytes in situ, suggesting that these functional microdomains might also be present in intact tissue.

In the second part of the work, we investigated whether astrocytes possess some of the benchmarks of brain dopaminergic cells. It has been known for years that astrocytes are able to metabolize monoamines via the enzymes MAO and COMT, but to date no clear evidence has been provided that glial cells are able to take up and store monoamines. Here, we identified a whole apparatus for the storage, degradation and release of monoamines at the ultrastructural level. Electron microscopy immunohistochemistry allowed us to visualize VMAT2- and dopamine-positive intracellular compartments within astrocytic processes, i.e. dense-core granules and cisterns. These organelles might be responsible for dopamine release and storage, respectively. Interestingly, this intracellular distribution is reminiscent of VMAT2 expression in the dendrites of neurons, where dopamine release is tonic and plays a role in the regulation of its basal levels, suggesting that astrocytic VMAT2 is involved in the homeostasis of dopamine in the healthy brains of adult mammals.

Over the last decade, numerous results have been provided on the release of transmitters by astrocytes that can modulate synaptic activity (gliotransmission). Nevertheless, gliotransmission remains a highly debated process, notably because of the absence of direct morphological and functional evidence for this phenomenon. In our work we present numerous results supporting the gliotransmission hypothesis, including i) an in-depth study of the spatial and temporal properties of the regulated secretion of glutamate in astrocytes, and ii) a study of the participation of astrocytes in the transmission of dopamine, a neuromodulator whose interaction with astrocytes is highly probable but has never been proven. The expression of glutamatergic synaptic-like microvesicles (SLMVs) carrying the vesicular glutamate transporters (VGLUTs) in astrocytes had already been demonstrated both in situ and in vitro. To reveal the precise secretion properties of these organelles, we adapted experimental methods designed to observe exocytosis and endocytosis in neurons. The spatial and temporal resolution obtained through the parallel use of epifluorescence and evanescent-wave fluorescence (TIRF) imaging allowed us to show that regulated secretion in astrocytes is an extremely fast process (on the order of milliseconds) that is able to sustain and follow signal transmission between neurons. We also discovered that this secretion takes place in particular subcellular compartments where the endoplasmic reticulum (ER) is present and where rapid calcium elevations are observed. This complex spatial organization could be the morphological basis of the fast coupling between stimulus and secretion; moreover, several recent studies seem to confirm the existence of these compartments. It has been known for years that astrocytes can metabolize monoamines through the enzymes MAO and COMT. We have now provided new evidence for the presence, at the ultrastructural level, of a storage apparatus in astrocytes participating in the degradation and release of monoamines. Using electron microscopy, we discovered intracellular compartments expressing VMAT2 in astrocytic processes, in the form of granules and cisterns. These organelles could therefore be responsible for both the release and the storage of dopamine. Strikingly, this intracellular distribution is similar to that in the dendrites of VMAT2-expressing neurons, where dopamine is released tonically and acts to regulate its basal levels. These results suggest a participation of the VMAT2 present in astrocytes in the homeostasis of dopamine in the brain.

In science programmes and films it is often claimed that humans use only 10% of their brain's potential. This legend probably stems from the fact that the first researchers to describe brain cells, between the nineteenth and twentieth centuries, showed that neurons, the best-known and most-studied cells of this organ, represent only about 10% of all the cells composing the brain. Among the remaining 90%, astrocytes are without doubt the most numerous. Until the early 1990s, astrocytes were considered little more than connective tissue, whose main roles were to maintain certain physical properties of the brain and to provide metabolic support (energy, a clean environment) to neurons. Thanks to the discovery that astrocytes can release neuroactive substances, notably glutamate, the role of astrocytes in brain function has recently been reconsidered.

The role of glutamate originating from astrocytes and its impact on neuronal function has not yet been fully elucidated, despite numerous publications demonstrating the importance of this phenomenon for various brain functions. To better understand how astrocytes are involved in brain transmission, we studied the spatio-temporal properties of this release using several fluorescent markers combined with different cellular imaging techniques. We discovered that the release of glutamate by astrocytes (a process now called "gliotransmission") is very fast and controlled by local calcium elevations, and we linked these phenomena to subcellular functional domains morphologically suited to this type of transmission. More recently, we have focused our studies on another transmitter that is very important for brain function: dopamine. Our morphological results seem to indicate that astrocytes are able to interact with this transmitter, but in a different manner compared with glutamate, notably in terms of speed of transmission. These results suggest that astrocytes can modify their characteristics and adapt to their environment according to the type of transmitter with which they must interact.
Abstract:
Self-consciousness has mostly been approached by philosophical enquiry and not by empirical neuroscientific study, leading to an overabundance of diverging theories and an absence of data-driven theories. Using robotic technology, we achieved specific bodily conflicts and induced predictable changes in a fundamental aspect of self-consciousness by altering where healthy subjects experienced themselves to be (self-location). Functional magnetic resonance imaging revealed that temporo-parietal junction (TPJ) activity reflected experimental changes in self-location that also depended on the first-person perspective due to visuo-tactile and visuo-vestibular conflicts. Moreover, in a large lesion analysis study of neurological patients with a well-defined state of abnormal self-location, brain damage was also localized at TPJ, providing causal evidence that TPJ encodes self-location. Our findings reveal that multisensory integration at the TPJ reflects one of the most fundamental subjective feelings of humans: the feeling of being an entity localized at a position in space and perceiving the world from this position and perspective.
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital, and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners, and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: potential toxicity and safety hazards of nanomaterials throughout their lifecycles; the fate and persistence of nanoparticles in humans, animals and the environment; the risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; the development of best practice guidelines; voluntary schemes on responsibility; and databases of materials, research topics and themes. The findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and the free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to become active members. These survey findings will be used to improve NIN's communication tools and to further build interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
Purpose of review: This review provides information and an update on stereotactic radiosurgery (SRS) equipment, with a focus on intracranial lesions and brain neoplasms.
Recent findings: Gamma Knife radiosurgery represents the gold standard for intracranial radiosurgery, using dedicated equipment, and has recently evolved with a newly designed technology, the Leksell Gamma Knife Perfexion. Linear accelerator-based radiosurgery is more recent and was originally based on existing systems, either adapted or dedicated to radiosurgery. Equipment incorporating specific technologies, such as the robotic CyberKnife system, has been developed. Novel concepts in radiation therapy delivery techniques, such as intensity-modulated radiotherapy, were also developed; their integration with computed tomography imaging and helical delivery has led to the TomoTherapy system. Recent data on the management of intracranial tumors with radiosurgery illustrate the trend toward wider use and acceptance of this therapeutic modality.
Summary: SRS has become an important alternative treatment for a variety of lesions. Each radiosurgery system has its advantages and limitations; the 'perfect' and ubiquitous system does not exist. The choice of a radiosurgery system may vary with the strategy and needs of specific radiosurgery programs. No center can afford to acquire every technology, and strategic choices have to be made. Institutions with large neurosurgery and radiation oncology programs usually have more than one system, allowing optimization of patient management with a choice of open neurosurgery, radiosurgery, and radiotherapy. Given its minimally invasive nature and increasing clinical acceptance, SRS will continue to progress and offer new advances as a therapeutic tool in neurosurgery and radiotherapy.
Abstract:
Performance prediction and application behavior modeling have been the subject of extensive research aiming to estimate application performance with acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of parallel application signatures, which consists of extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, an estimate of the application's total execution time can be made. One problem is that the performance of an application depends on the program's workload. Each type of workload affects differently how an application performs on a given system, and so affects the signature's execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We created a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight as functions of the workload, in order to predict an application's performance on a target system for any workload within a predefined range. We validated our methodology using a synthetic program, benchmark applications, and well-known real scientific applications.
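A minimal sketch of the regression step under stated assumptions (the phase names, measurements and linear fit below are invented for illustration; the thesis's phase extraction and models are more elaborate):

```python
# Model each phase's execution time as a function of workload size, then
# predict total runtime as sum(weight * predicted phase time).
import numpy as np

workloads = np.array([100, 200, 400, 800])        # measured workload sizes
phase_times = {                                   # per-phase timings (seconds)
    "compute": np.array([0.8, 1.7, 3.3, 6.9]),
    "exchange": np.array([0.2, 0.3, 0.6, 1.1]),
}
phase_weights = {"compute": 50, "exchange": 50}   # repetitions of each phase

def predict_total(workload, degree=1):
    """Fit time-vs-workload per phase by regression and sum weighted predictions."""
    total = 0.0
    for name, times in phase_times.items():
        coeffs = np.polyfit(workloads, times, degree)     # regression fit per phase
        total += phase_weights[name] * np.polyval(coeffs, workload)
    return total

print(f"predicted runtime at workload 600: {predict_total(600):.1f} s")
```

The key restriction, as in the abstract, is that predictions are only trusted for workloads inside the range covered by the fitted measurements.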
Abstract:
BACKGROUND: Advances in nebulizer design have produced both ultrasonic nebulizers and devices based on a vibrating mesh (vibrating mesh nebulizers), which are expected to enhance the efficiency of aerosol drug therapy. The aim of this study was to compare 4 different nebulizers, of 3 different types, in an in vitro model, using albuterol delivery and physical characteristics as benchmarks. METHODS: The following nebulizers were tested: the Sidestream Disposable jet nebulizer, the Multisonic Infra Control ultrasonic nebulizer, and the Aerogen Pro and Aerogen Solo vibrating mesh nebulizers. Aerosol duration, temperature, and drug solution osmolality were measured during nebulization. Albuterol delivery was measured by high-performance liquid chromatography with fluorometric detection. The droplet size distribution was analyzed with a laser granulometer. RESULTS: The ultrasonic nebulizer was the fastest device in terms of nebulization duration; the jet nebulizer was the slowest. Solution temperature decreased during nebulization with the jet and vibrating mesh nebulizers, but increased with the ultrasonic nebulizer. Osmolality was stable during nebulization with the vibrating mesh nebulizers, but increased with the jet and ultrasonic nebulizers, indicating solvent evaporation. Albuterol delivery was 1.6 and 2.3 times higher with the ultrasonic and vibrating mesh nebulizers, respectively, than with the jet nebulizer. Particle size was significantly larger with the ultrasonic nebulizer. CONCLUSIONS: The in vitro model was effective for comparing nebulizers, demonstrating important differences between nebulizer types. The new devices, both the ultrasonic and vibrating mesh nebulizers, delivered more aerosolized drug than the traditional jet nebulizer.