502 results for BENCHMARKS


Relevance:

10.00%

Publisher:

Abstract:

Time-varying parameter (TVP) models have enjoyed increasing popularity in empirical macroeconomics. However, TVP models are parameter-rich and risk over-fitting unless the dimension of the model is small. Motivated by this concern, this paper proposes several time-varying dimension (TVD) models, in which the dimension of the model can change over time, allowing the model to automatically choose a more parsimonious TVP representation or to switch between different parsimonious representations. Our TVD models all fall into the category of dynamic mixture models. We discuss the properties of these models and present methods for Bayesian inference. An application involving US inflation forecasting illustrates and compares the different TVD models. We find that our TVD approaches exhibit better forecasting performance than several standard benchmarks and shrink towards parsimonious specifications.

Relevance:

10.00%

Publisher:

Abstract:

This paper extends the Nelson-Siegel linear factor model by developing a flexible macro-finance framework for modeling and forecasting the term structure of US interest rates. Our approach is robust to parameter uncertainty and structural change, as we consider instabilities in parameters and volatilities, and our model averaging method allows for investors' model uncertainty over time. Our time-varying parameter Nelson-Siegel Dynamic Model Averaging (NS-DMA) predicts yields better than standard benchmarks and successfully captures plausible time-varying term premia in real time. The proposed model has significant in-sample and out-of-sample predictability for excess bond returns, and the predictability is of economic value.

Relevance:

10.00%

Publisher:

Abstract:

This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in a way that is accessible to decision makers with little or no background in economics or operational research. The use of mathematics is kept to a minimum, and the guide adopts a strongly practical approach so that decision makers can conduct their own efficiency analyses and easily interpret the results. DEA helps decision makers in the following ways:
- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for inputs and outputs, it calculates by how much inputs must be decreased or outputs increased for the firm to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize its average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes should be analysed so that a firm can improve its own practices.
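The efficiency score that DEA calculates can be obtained by solving a small linear program for each firm. The following is a minimal sketch of the input-oriented CCR model under constant returns to scale, using made-up single-input, single-output data and SciPy's `linprog` as the solver; neither the data nor the solver choice comes from the guide itself:

```python
from scipy.optimize import linprog

# Hypothetical data: one input and one output for three firms.
x = [2.0, 4.0, 8.0]   # inputs
y = [1.0, 2.0, 3.0]   # outputs

def dea_efficiency(k):
    """Input-oriented CCR efficiency score of firm k (1.0 = efficient)."""
    n = len(x)
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = [1.0] + [0.0] * n
    # Input constraint:  sum(lambda_j * x_j) - theta * x_k <= 0
    # Output constraint: -sum(lambda_j * y_j) <= -y_k
    A_ub = [[-x[k]] + x,
            [0.0] + [-v for v in y]]
    b_ub = [0.0, -y[k]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

for k in range(3):
    print(f"firm {k}: efficiency = {dea_efficiency(k):.2f}")
```

With these toy numbers the third firm produces proportionally less output per unit of input than its peers, so its score falls below 1 and the gap indicates its capacity for improvement.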

Relevance:

10.00%

Publisher:

Abstract:

Over the last decade, extensive evidence has accumulated that the release of chemical transmitters from astrocytes (so-called "gliotransmission") can modulate neuronal activity in situ. Nevertheless, gliotransmission remains a highly debated topic because of the lack of direct morphological and functional evidence. Here we provide new information supporting gliotransmission by i) deepening knowledge of the specific properties of the regulated secretion of glutamatergic SLMVs, and ii) investigating the involvement of astrocytes in the transmission of dopamine, a molecule whose interaction with astrocytes is likely but not yet proven.

VGLUT-expressing glutamatergic SLMVs had previously been identified both in situ and in vitro, but a description of their release kinetics was still lacking. To elucidate this issue, we took advantage of fluorescent tools (styryl dyes and pHluorin) and adapted experimental paradigms and analysis methods previously developed to study the exo-endocytosis and recycling of glutamatergic vesicles at synapses. The parallel use of epifluorescence and total internal reflection fluorescence (TIRF) imaging allowed us to find that exo-endocytosis in astrocytes is extremely fast, with kinetics on the order of milliseconds, able to sustain and follow neuronal signalling at synapses. Moreover, the exocytosis of SLMVs is under the control of fast, localized Ca2+ elevations in close proximity to SLMVs and endoplasmic reticulum (ER) tubules, the intracellular calcium stores. This complex organization supports the fast stimulus-secretion coupling we describe; localized calcium elevations have recently been observed in astrocytes in situ, suggesting that these functional microdomains might also be present in intact tissue. In the second part of the work, we investigated whether astrocytes possess some of the benchmarks of brain dopaminergic cells.
It has been known for years that astrocytes are able to metabolize monoamines via the enzymes MAO and COMT, but to date no clear evidence that glial cells are able to take up and store monoamines has been provided. Here, we identified a whole apparatus for the storage, degradation and release of monoamines at the ultrastructural level. Electron microscopy immunohistochemistry allowed us to visualize VMAT2- and dopamine-positive intracellular compartments within astrocytic processes, i.e. dense-core granules and cisterns. These organelles might be responsible for dopamine release and storage, respectively. Interestingly, this intracellular distribution is reminiscent of VMAT2 expression in the dendrites of neurons, where dopamine release is tonic and plays a role in the regulation of its basal levels, suggesting that astrocytic VMAT2 is involved in the homeostasis of dopamine in the healthy adult mammalian brain.

[Translated from French:] Over the last decade, numerous results have been provided on the release of transmitters by astrocytes that can modulate synaptic activity (gliotransmission). Nevertheless, gliotransmission remains a highly debated process, notably because of the absence of direct morphological and functional evidence demonstrating this phenomenon. In this work we present numerous results supporting the gliotransmission hypothesis, including i) an in-depth study of the spatial and temporal properties of regulated glutamate secretion in astrocytes, and ii) a study of the participation of astrocytes in dopamine transmission, a neuromodulator whose interaction with astrocytes is highly probable but has never been proven. The expression in astrocytes of small glutamatergic vesicles (SLMVs, synaptic-like microvesicles) expressing the vesicular glutamate transporters (VGLUTs) had already been demonstrated both in situ and in vitro.

To reveal the precise secretion properties of these organelles, we adapted experimental methods designed to observe exocytosis and endocytosis in neurons. The spatial and temporal resolutions obtained through the parallel use of epifluorescence and evanescent-wave (TIRF) imaging allowed us to show that regulated secretion in astrocytes is an extremely fast process (on the order of milliseconds), capable of sustaining and following the transmission of signals between neurons. We also discovered that this secretion takes place in particular subcellular compartments in which we observe the presence of the endoplasmic reticulum (ER) as well as fast calcium elevations. This complex spatial organization could be the morphological basis of the fast coupling between stimulus and secretion. Moreover, several recent in vivo studies appear to confirm the existence of these compartments. It has been known for years that astrocytes are capable of metabolizing monoamines via the enzymes MAO and COMT. We have therefore provided new evidence, at the ultrastructural level, for the presence in astrocytes of an apparatus for the storage, degradation and release of monoamines. Using electron microscopy, we discovered intracellular compartments expressing VMAT2 in astrocytic processes, in the form of granules and cisterns. These organelles could thus be responsible for both the release and the storage of dopamine. Surprisingly, this intracellular distribution is similar to that of VMAT2-expressing neuronal dendrites, where dopamine is released tonically to regulate its basal levels.

These results suggest a participation of astrocytic VMAT2 in the process of dopamine homeostasis in the brain.

It is often claimed, in science programmes or in films, that humans use only 10% of their brain's potential. This legend probably stems from the fact that the first researchers to describe brain cells, between the nineteenth and twentieth centuries, showed that neurons, the best-known and most-studied cells of this organ, represent only about 10% of all the cells composing the brain. Among the remaining 90%, astrocytes are without doubt the most numerous. Until the early 1990s, astrocytes were regarded as little more than connective tissue, whose main roles were to maintain certain physical properties of the brain and to provide metabolic support (energy, a clean environment) to neurons. Thanks to the discovery that astrocytes are able to release neuroactive substances, notably glutamate, the role of astrocytes in brain function has recently been reconsidered.

The role of glutamate released from astrocytes, and its impact on neuronal function, has not yet been fully elucidated, despite numerous publications demonstrating the importance of this phenomenon for various brain functions. To better understand how astrocytes are involved in brain transmission, we studied the spatio-temporal properties of this release using several fluorescent markers combined with different cellular imaging techniques. We discovered that the release of glutamate by astrocytes (a process now called "gliotransmission") is very fast and controlled by local calcium elevations.

We linked these phenomena to subcellular functional domains morphologically adapted to this type of transmission. More recently, we focused our studies on another transmitter of great importance for brain function: dopamine. Our morphological results suggest that astrocytes are able to interact with this transmitter, but in a different way than with glutamate, notably in terms of speed of transmission. These results suggest that astrocytes are able to modify their characteristics and adapt to their environment according to the type of transmitter with which they must interact.

Relevance:

10.00%

Publisher:

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: the potential toxic and safety hazards of nanomaterials throughout their lifecycles; the fate and persistence of nanoparticles in humans, animals and the environment; the risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; the development of best practice guidelines; voluntary schemes on responsibility; and databases of materials, research topics and themes. The findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to become active members. These survey findings will be used to improve NIN's communication tools and to further build interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance:

10.00%

Publisher:

Abstract:

Performance prediction and application behavior modeling have been the subject of extensive research aiming to estimate application performance with acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of parallel application signatures, which consists of extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, an estimate of the application's total execution time can be made. One problem is that the performance of an application depends on the program's workload. Each type of workload affects differently how an application performs on a given system, and so affects the signature's execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We created a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight as functions of the workload, so as to predict an application's performance on a target system for any workload within a predefined range. We validate our methodology using a synthetic program, benchmark applications and well-known real scientific applications.
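The weighted-sum prediction described above can be sketched in a few lines. The phase data, the linear per-phase regression, and the use of `numpy.polyfit` are illustrative assumptions for this sketch, not details taken from the abstract:

```python
import numpy as np

# Hypothetical signature: for each phase, measured execution times at a few
# workload sizes, plus the number of times the phase repeats (its weight).
phases = [
    {"sizes": [100, 200, 400], "times": [1.0, 2.1, 3.9], "weight": 10},
    {"sizes": [100, 200, 400], "times": [0.5, 0.9, 2.1], "weight": 5},
]

def predict_total_time(workload_size):
    """Predicted runtime = sum over phases of fitted_phase_time * weight."""
    total = 0.0
    for p in phases:
        # Least-squares linear fit: time ~ a * size + b
        a, b = np.polyfit(p["sizes"], p["times"], 1)
        total += (a * workload_size + b) * p["weight"]
    return total

print(f"predicted runtime at workload size 300: {predict_total_time(300):.2f} s")
```

A real signature would use per-phase regression models chosen to match each phase's observed scaling, but the structure of the estimate (fit each phase, then take the weighted sum) is the same.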

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Advances in nebulizer design have produced both ultrasonic nebulizers and devices based on a vibrating mesh (vibrating mesh nebulizers), which are expected to enhance the efficiency of aerosol drug therapy. The aim of this study was to compare 4 different nebulizers, of 3 different types, in an in vitro model, using albuterol delivery and physical characteristics as benchmarks. METHODS: The following nebulizers were tested: the Sidestream Disposable jet nebulizer, the Multisonic Infra Control ultrasonic nebulizer, and the Aerogen Pro and Aerogen Solo vibrating mesh nebulizers. Aerosol duration, temperature, and drug solution osmolality were measured during nebulization. Albuterol delivery was measured by a high-performance liquid chromatography system with fluorometric detection. The droplet size distribution was analyzed with a laser granulometer. RESULTS: The ultrasonic nebulizer was the fastest device based on the duration of nebulization; the jet nebulizer was the slowest. Solution temperature decreased during nebulization when the jet nebulizer and vibrating mesh nebulizers were used, but increased with the ultrasonic nebulizer. Osmolality was stable during nebulization with the vibrating mesh nebulizers, but increased with the jet nebulizer and ultrasonic nebulizer, indicating solvent evaporation. Albuterol delivery was 1.6 and 2.3 times higher with the ultrasonic nebulizer and the vibrating mesh nebulizers, respectively, than with the jet nebulizer. Particle size was significantly larger with the ultrasonic nebulizer. CONCLUSIONS: The in vitro model was effective for comparing nebulizers, demonstrating important differences between nebulizer types. The new devices, both the ultrasonic nebulizer and the vibrating mesh nebulizers, delivered more aerosolized drug than the traditional jet nebulizer.

Relevance:

10.00%

Publisher:

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, a workshop focused on building a sustainable multi-stakeholder dialogue was organised by NIN. Specific questions were put to the different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussion of this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics would be needed for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation such as REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced, so information is needed on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, which could have an impact on nanotechnology as a whole.
4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even where voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance:

10.00%

Publisher:

Abstract:

Assays that measure a patient's immune response play an increasingly important role in the development of immunotherapies. The inherent complexity of these assays and independent protocol development between laboratories result in high data variability and poor reproducibility. Quality control through harmonization, based on integrating laboratory-specific protocols with standard operating procedures and assay performance benchmarks, is one way to overcome these limitations. Harmonization guidelines can be widely implemented to address assay performance variables. This process enables objective interpretation and comparison of data across clinical trial sites and also facilitates the identification of relevant immune biomarkers, guiding the development of new therapies.

Relevance:

10.00%

Publisher:

Abstract:

[Translated from Catalan:] This project compares five virtualization solutions, focusing in particular on the performance of the virtualized machines, gathering information about them by running benchmarks and subsequently analysing the data obtained.

Relevance:

10.00%

Publisher:

Abstract:

The infinite slope method is widely used as the geotechnical component of geomorphic and landscape evolution models. Its assumption that shallow landslides are infinitely long (in a downslope direction) is usually considered valid for natural landslides on the basis that they are generally long relative to their depth. However, this is rarely justified, because the critical length/depth (L/H) ratio below which edge effects become important is unknown. We establish this critical L/H ratio by benchmarking infinite slope stability predictions against finite element predictions for a set of synthetic two-dimensional slopes, assuming that the difference between the predictions is due to error in the infinite slope method. We test the infinite slope method for six different L/H ratios to find the critical ratio at which its predictions fall within 5% of those from the finite element method. We repeat these tests for 5000 synthetic slopes with a range of failure plane depths, pore water pressures, friction angles, soil cohesions, soil unit weights and slope angles characteristic of natural slopes. We find that: (1) infinite slope stability predictions are consistently too conservative for small L/H ratios; (2) the predictions always converge to within 5% of the finite element benchmarks by an L/H ratio of 25 (i.e. the infinite slope assumption is reasonable for landslides 25 times longer than they are deep); but (3) they can converge at much lower ratios depending on slope properties, particularly for low cohesion soils. The implication for catchment scale stability models is that the infinite length assumption is reasonable if their grid resolution is coarse (e.g. >25 m). However, it may also be valid even at much finer grid resolutions (e.g. 1 m), because spatial organization in the predicted pore water pressure field reduces the probability of short landslides and minimizes the risk that predicted landslides will have L/H ratios less than 25.
Copyright (c) 2012 John Wiley & Sons, Ltd.
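For reference, the standard infinite slope factor-of-safety expression that such benchmarking starts from can be sketched as follows. The parameter values below are illustrative only and are not taken from the study:

```python
from math import cos, sin, tan, radians

def infinite_slope_fs(c, phi_deg, gamma, depth, slope_deg, pore_pressure):
    """Factor of safety of an infinite planar slope (limit equilibrium).

    c: effective cohesion (kPa), phi_deg: effective friction angle (deg),
    gamma: soil unit weight (kN/m^3), depth: failure plane depth H (m),
    slope_deg: slope angle (deg), pore_pressure: u on the plane (kPa).
    """
    b = radians(slope_deg)
    normal_stress = gamma * depth * cos(b) ** 2          # total normal stress
    shear_stress = gamma * depth * sin(b) * cos(b)       # driving shear stress
    resistance = c + (normal_stress - pore_pressure) * tan(radians(phi_deg))
    return resistance / shear_stress

# Illustrative values: for a dry, cohesionless soil the expression
# reduces to FS = tan(phi) / tan(beta).
fs = infinite_slope_fs(c=0.0, phi_deg=30.0, gamma=19.0, depth=2.0,
                       slope_deg=25.0, pore_pressure=0.0)
```

Because every term is per-unit-length downslope, the formula ignores the resistance mobilised at a landslide's upslope and downslope ends, which is exactly the edge effect the L/H benchmarking above quantifies.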

Relevance:

10.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of a classical PFSP heuristic to generate alternative initial solutions of similar quality. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating our biased randomization process, together with a high-quality pseudo-random number generator, into it.
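The three ILS stages named above (local search, perturbation, acceptance) can be sketched generically for the PFSP. This is a minimal illustration with a toy two-machine instance and naive operators (random swap perturbation, improvement-only acceptance), not the ILS-ESP operators the article proposes:

```python
import random

def makespan(seq, proc):
    """Completion time of the last job on the last machine (permutation flowshop)."""
    m = len(proc[0])
    comp = [0.0] * m
    for j in seq:
        comp[0] += proc[j][0]
        for k in range(1, m):
            comp[k] = max(comp[k], comp[k - 1]) + proc[j][k]
    return comp[-1]

def local_search(seq, proc):
    """First-improvement job reinsertion until no improving move exists."""
    best = makespan(seq, proc)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for pos in range(len(seq)):
                if pos == i:
                    continue
                cand = seq[:i] + seq[i + 1:]
                cand.insert(pos, seq[i])
                c = makespan(cand, proc)
                if c < best:
                    seq, best, improved = cand, c, True
    return seq, best

def ils(proc, iters=200, seed=0):
    rng = random.Random(seed)
    seq, best = local_search(list(range(len(proc))), proc)
    best_seq = seq[:]
    for _ in range(iters):
        cand = seq[:]
        i, j = rng.sample(range(len(cand)), 2)   # perturbation: random swap
        cand[i], cand[j] = cand[j], cand[i]
        cand, c = local_search(cand, proc)
        if c < best:                              # acceptance: improvement only
            seq, best, best_seq = cand, c, cand[:]
    return best_seq, best

# proc[j][k] = processing time of job j on machine k (toy 4-job, 2-machine instance)
proc = [[3, 2], [1, 4], [2, 2], [4, 1]]
order, cmax = ils(proc)
print(f"best sequence: {order}, makespan: {cmax}")
```

The ESP idea in the abstract is precisely that the operators replacing these naive ones need no tuned parameters; the skeleton of the metaheuristic stays the same.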

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this publication is to present the Executive Summary for the Program Year 2002 report on Iowa’s adult basic education program benchmarks. The passage of the Workforce Investment Act (WIA) of 1998 [Public Law 105-220] by the 105th Congress has ushered in a new era of collaboration, coordination, cooperation and accountability. The overall goal of the Act is “to increase the employment, retention, and earnings of participants, and increase occupational skill attainment by participants, and, as a result improve the quality of the workforce, reduce welfare dependency, and enhance the productivity and competitiveness of the Nation.”

Relevance:

10.00%

Publisher:

Abstract:

One of the major intents of AEFLA was to establish performance measures and benchmarks to demonstrate increased accountability in line with the major goals and objectives of WIA. Section 212(2)(A) of the Act specifies that each eligible agency (e.g., the Iowa Department of Education) is subject to certain core indicators of performance and has the authority to specify additional indicators.

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this publication is to present the Executive Summary of the first annual report on the results of Iowa’s adult basic education program benchmarks. The passage of the Workforce Investment Act (WIA) of 1998 [Public Law 105-220] by the 105th Congress has ushered in a new era of collaboration, coordination, cooperation and accountability. The overall goal of the Act is “to increase the employment, retention, and earnings of participants, and increase occupational skill attainment by participants, and, as a result improve the quality of the workforce, reduce welfare dependency, and enhance the productivity and competitiveness of the Nation.”